Merge branch 'current' into next

This commit is contained in:
Paulus Schoutsen 2017-02-04 20:49:32 -08:00
commit d3627d6af7
54 changed files with 1111 additions and 372 deletions

View file

@ -15,8 +15,6 @@ ha_release: pre 0.7
The `mjpeg` camera platform allows you to integrate IP cameras which are capable of streaming their video as MJPEG into Home Assistant.
Home Assistant will serve the images via its server, making it possible to view your IP cameras while outside of your network. The endpoint is `/api/camera_proxy/camera.[name]?time=[timestamp]`.
To enable this camera in your installation, add the following to your `configuration.yaml` file:
```yaml

View file

@ -60,6 +60,8 @@ device_tracker:
- 10.0.0.2
- 10.0.0.15
```
In the above example, Nmap will be called with the command:
`nmap -oX - 192.168.1.1/24 10.0.0.2 10.0.0.15 -F --host-timeout 5s`
An example of how the Nmap scanner can be customized:
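As a sketch, such a customization could look like the following (`home_interval` and `scan_options` are assumed options of the `nmap_tracker` platform; values are illustrative):

```yaml
# Sketch only: option names and values are assumptions, not verified defaults
device_tracker:
  - platform: nmap_tracker
    hosts: 192.168.1.1/24
    home_interval: 10
    scan_options: " --privileged -sP "
```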

View file

@ -103,6 +103,11 @@ You can verify that the `emulated_hue` component has been loaded and is respondi
- `http://<HA IP Address>:8300/description.xml` - This URL should return a descriptor file in the form of an XML file.
- `http://<HA IP Address>:8300/api/pi/lights` - This will return a list of devices, lights, scenes, groups, etc. that `emulated_hue` is exposing to Alexa.
An additional step is required to run Home Assistant as a non-root user and use port 80 when using the AiO script. Execute the following command to allow `emulated_hue` to use port 80 as a non-root user.
```bash
sudo setcap 'cap_net_bind_service=+ep' /srv/homeassistant/homeassistant_venv/bin/python3
```
### {% linkable_title License %}

View file

@ -30,14 +30,16 @@ Configuration variables:
### {% linkable_title Raspbian Debian Jessie Lite Installations %}
To get the binary on Raspbian Debian Jessie Lite on an RPi you need to perform the following:
```bash
$ sudo apt-get install libav-tools
```
This will install a forked version of FFmpeg called `avconv`. Alternatively, to install the real FFmpeg, pull it in from the Debian backports:
```bash
$ sudo echo "deb http://ftp.debian.org/debian jessie-backports main" >> /etc/apt/sources.list
$ sudo apt-get update
$ sudo apt-get -t jessie-backports install ffmpeg
```
We can now use the following in the configuration:
```yaml
ffmpeg:
ffmpeg_bin: /usr/bin/avconv
ffmpeg_bin: /usr/bin/ffmpeg
```
### {% linkable_title Troubleshooting %}

View file

@ -27,16 +27,23 @@ The computer running Home Assistant must support CEC, and of course be connected
#### {% linkable_title Symlinking into virtual environment %}
Create a symlink to the `cec` installation.
Create a symlink to the `cec` installation. Keep in mind different installation methods will result in different locations of cec.
```bash
$ ln -s /usr/local/lib/python3.4/dist-packages/cec /path/to/your/venv/lib/python3.4/site-packages
$ ln -s /path/to/your/installation/of/cec /path/to/your/venv/lib/python3.4/site-packages
```
##### {% linkable_title Symlinking examples: %}
For the default virtual environment of a [HASSbian Image for Raspberry Pi](/getting-started/installation-raspberry-pi-image/) the command would be as follows.
```bash
$ ln -s /usr/local/lib/python3.4/dist-packages/cec /srv/homeassistant/lib/python3.4/site-packages
```
For the default virtual environment of a [Raspberry Pi All-In-One installation](/getting-started/installation-raspberry-pi-all-in-one/) the command would be as follows.
```bash
$ ln -s /usr/local/lib/python3.4/dist-packages/cec /srv/hass/hass_venv/lib/python3.4/site-packages
$ ln -s /usr/local/lib/python3.4/site-packages/cec /srv/homeassistant/homeassistant_venv/lib/python3.4/site-packages
```
For the default virtual environment of a [Manual installation](/getting-started/installation-raspberry-pi/) the command would be as follows.

View file

@ -14,6 +14,10 @@ Image processing enables Home Assistant to process images from [cameras](/compon
For interval control, use `scan_interval` in the platform configuration.
<p class='note'>
If you are running Home Assistant over SSL or from within a container, you will have to set up a base URL inside the [http component](/components/http/).
</p>
## {% linkable_title ALPR %}
ALPR entities have a `vehicles` attribute with a count of found vehicles and a `plates` attribute listing all found plates.
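As an illustration, these attributes can be read from a template, e.g. in an automation (the entity id below is hypothetical):

```yaml
# Hypothetical automation reacting to the vehicle counter of an ALPR entity
automation:
  - alias: "Announce detected plates"
    trigger:
      - platform: state
        entity_id: image_processing.alpr_garage  # hypothetical entity id
    condition:
      - condition: template
        value_template: "{% raw %}{{ trigger.to_state.attributes.vehicles | int > 0 }}{% endraw %}"
    action:
      - service: notify.notify
        data_template:
          message: "{% raw %}Plates found: {{ trigger.to_state.attributes.plates }}{% endraw %}"
```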

View file

@ -8,7 +8,7 @@ comments: false
sharing: true
footer: true
logo: microsoft.png
ha_category: Image_Processing
ha_category: Image Processing
featured: false
ha_release: 0.37
---

View file

@ -32,10 +32,11 @@ light:
Configuration variables:
- **host** (*Optional*): IP address of the device, e.g. 192.168.1.10. Required if not using the `discovery` component to discover Hue bridges.
- **allow_unreachable** (*Optional*): This will allow unreachable bulbs to report their state correctly. By default *name* from the device is used.
- **allow_unreachable** (*Optional*): (true/false) This will allow unreachable bulbs to report their state correctly.
- **filename** (*Optional*): Make this unique if specifying multiple Hue hubs.
- **allow_in_emulated_hue** (*Optional*): Enable this to block all Hue entities from being added to the `emulated_hue` component.
- **allow_hue_groups** (*Optional*): Enable this to stop Home Assistant from importing the groups defined on the Hue bridge.
- **allow_in_emulated_hue** (*Optional*): (true/false) Enable this to block all Hue entities from being added to the `emulated_hue` component.
- **allow_hue_groups** (*Optional*): (true/false) Enable this to stop Home Assistant from importing the groups defined on the Hue bridge.
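Taken together, a configuration using these options might look like this sketch (the host, filename, and flag values are illustrative, not recommendations):

```yaml
light:
  - platform: hue
    host: 192.168.1.10          # illustrative bridge IP
    allow_unreachable: true
    filename: phue2.conf        # unique name for a second bridge
    allow_in_emulated_hue: false
    allow_hue_groups: true
```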
### {% linkable_title Using Hue Groups in Home Assistant %}

View file

@ -14,6 +14,8 @@ ha_release: 0.32
The `yeelight` light platform allows you to control your Yeelight Wifi bulbs with Home Assistant.
### {% linkable_title Example configuration %}
To enable those lights, add the following lines to your `configuration.yaml` file:
```yaml
@ -22,16 +24,26 @@ light:
- platform: yeelight
devices:
192.168.1.25:
name: Front Door
192.168.1.13:
name: Living Room
transition: 1000
use_music_mode: True  # defaults to False
save_on_change: False  # defaults to True
192.168.1.13:
name: Front Door
```
Configuration variables:
- **ip** (*Required*): IP address(es) of your Wi-Fi bulbs
- **name** (*Optional*): A friendly name for the device.
- **transition** (*Optional*, default 350): Smooth transitions over time (in ms).
- **use_music_mode** (*Optional*, default False): Enable music mode.
- **save_on_change** (*Optional*, default True): Saves the bulb state when changed from Home Assistant.
#### {% linkable_title Music mode %}
By default the bulb limits the number of requests to 60 per minute, a limitation which can be bypassed by enabling music mode. In music mode the bulb is commanded to connect back to a socket provided by the component and tries to keep the connection open, which may not be desirable in all use cases.
### {% linkable_title Initial setup %}
<p class='note'>
Before trying to control your light through Home Assistant, you have to set up your bulb using the Yeelight app ([Android](https://play.google.com/store/apps/details?id=com.yeelight.cherry&hl=fr), [iOS](https://itunes.apple.com/us/app/yeelight/id977125608?mt=8)).
In the bulb's properties, you have to enable "Developer Mode". Developer mode may only be available with the latest firmware installed on your bulb. Firmware can be updated in the application after connecting the bulb.
@ -39,7 +51,7 @@ Determine your bulb ip (using router, software, ping ...)
</p>
<p class='note warning'>
Tests are only made with a YLDP03YL model. Because it's the only hardware developer owns. If you have bugs with another kind of model, you could open an issue on [Home Assistant Github](https://github.com/home-assistant/home-assistant)
This component is tested to work with models YLDP01YL and YLDP03YL. If you have a different model and it is working please let us know.
</p>

View file

@ -24,3 +24,6 @@ The requirement is that you have setup [Wink](/components/wink/).
- Schlage
- Generic Z-wave
<p class='note'>
If supported by your lock, a binary sensor will be created for each user key code you have defined. These sensors will turn on when the code is entered and automatically turn off after a few seconds.
</p>

View file

@ -1,7 +1,7 @@
---
layout: page
title: "Microsoft Face"
description: "Instructions how to integrate Microsoft Face component into Home Assistant."
description: "Instructions on how to integrate Microsoft Face component into Home Assistant."
date: 2017-01-25 00:00
sidebar: true
comments: false
@ -12,9 +12,9 @@ ha_category: Hub
ha_release: "0.37"
---
The `microsoft_face` component platform is the main component for Microsoft Azure Cognitive service [Face](https://www.microsoft.com/cognitive-services/en-us/face-api). All data are in a own private instance in the Azure cloud.
The `microsoft_face` component platform is the main component for Microsoft Azure Cognitive service [Face](https://www.microsoft.com/cognitive-services/en-us/face-api). All data are stored in your own private instance in the Azure cloud.
You need an API key which is free but requires a [Azure registration](https://azure.microsoft.com/de-de/free/) with your microsoft ID. The free resource (*F0*) is limit to 30k requests in a month and 20 per minute. If you don't want use a the Azure cloud, you can also get a API key with registration on [cognitive-services](https://www.microsoft.com/cognitive-services/en-us/subscriptions) but they need to recreate all 90 days.
You need an API key which is free but requires a [Azure registration](https://azure.microsoft.com/de-de/free/) using your microsoft ID. The free resource (*F0*) is limited to 20 requests per minute and 30k requests in a month. If you don't want to use the Azure cloud, you can also get an API key by registering with [cognitive-services](https://www.microsoft.com/cognitive-services/en-us/subscriptions). However, all keys on cognitive services must be recreated every 90 days.
To enable the Microsoft Face component, add the following lines to your `configuration.yaml`:
@ -31,9 +31,9 @@ Configuration variables:
### {% linkable_title Person and Groups %}
For most of the services you need to set up a group or a person. This limits the processing and detection to elements provided by group. Home Assistent creates for all group a entity and allow you to show the state, person and IDs directly on the frontend.
For most services, you need to set up a group or a person. This limits the processing and detection to elements provided by the group. Home Assistant creates an entity for each group and allows you to show the state, person and IDs directly on the frontend.
For managing this feature, you have the following services. They can be called with the Frontend, a script, or the REST API.
The following services are available for managing this feature. They can be called via the Frontend, a script, or the REST API.
- *microsoft_face.create_group*
- *microsoft_face.delete_group*
@ -66,15 +66,15 @@ data:
camera_entity: camera.door
```
For the local image we need `curl`. The person ID is present in group entity as attribute.
For the local image we need `curl`. The `{personId}` is present in the group entity as an attribute.
```bash
$ curl -v -X POST "https://westus.api.cognitive.microsoft.com/face/v1.0/persongroups/{GroupName}/persons/{personId}/persistedFaces" \
-H "Ocp-Apim-Subscription-Key: YOUR_API_KEY" \
-H "Content-Type: application/octet-stream" --data "@/tmp/image.jpg"
-H "Content-Type: application/octet-stream" --data-binary "@/tmp/image.jpg"
```
After we done with changes on a group, we need train this group to make our AI fit to handle the new data.
After we're done with changes on a group, we need to train this group to teach the AI how to handle the new data.
- *microsoft_face.train_group*

View file

@ -279,6 +279,13 @@ switch:
platform: mqtt
state_format: 'json:somekey[0].value'
```
It is also possible to extract JSON values by using a value template:
```yaml
switch:
platform: mqtt
value_template: '{% raw %}{{ value_json.somekey[0].value }}{% endraw %}'
```
More information about the full JSONPath syntax can be found [here][JSONPath syntax].

View file

@ -31,6 +31,7 @@ Configuration variables:
- **name** (*Optional*): Name of the command sensor.
- **unit_of_measurement** (*Optional*): Defines the unit of measurement of the sensor, if any.
- **value_template** (*Optional*): Defines a [template](/topics/templating/) to extract a value from the payload.
- **scan_interval** (*Optional*): Defines the polling interval in seconds.
## {% linkable_title Examples %}

View file

@ -60,28 +60,28 @@ switch 2:
timeout: 15
switches:
# Will work on most Phillips tvs:
tv:
friendly_name: "Phillips Tv"
tv_phillips:
friendly_name: "Phillips Tv Power"
command_on: 'JgAcAB0dHB44HhweGx4cHR06HB0cHhwdHB8bHhwADQUAAAAAAAAAAAAAAAA='
command_off: 'JgAaABweOR4bHhwdHB4dHRw6HhsdHR0dOTocAA0FAAAAAAAAAAAAAAAAAAA='
# Will work on most LG tvs
tv_lg:
friendly_name: "LG Tv"
friendly_name: "LG Tv Power"
command_on: 'JgBYAAABIJISExETETcSEhISEhQQFBETETcROBESEjcRNhM1EjcTNRMTERISNxEUERMSExE2EjYSNhM2EhIROBE3ETcREhITEgAFGwABH0oSAAwzAAEfShEADQU='
command_off: 'JgBYAAABIJISExETETcSEhISEhQQFBETETcROBESEjcRNhM1EjcTNRMTERISNxEUERMSExE2EjYSNhM2EhIROBE3ETcREhITEgAFGwABH0oSAAwzAAEfShEADQU='
tv_lg_HDMI1_HDMI2:
friendly_name: "LG Tv"
tv_lg_hdmi1_hdmi2:
friendly_name: "LG Tv HDMI12"
command_on: 'JgBIAAABIZMRExITEjYSExMRERURExEUEDkRNxEUEjYSNhM3ETcSNxITETgSNhI2ExMQExE4ETYSNxIUERMSExE4ETcRFBETEQANBQ=='
command_off: 'JgBQAAABJJMSEhISETgSEhITEBMSEhMSETcSNxMREjcSNxI3EjcSOBETERITNhM2EhITERM2EzcRNxI3ExISEhI3EjcRExETEgAFLQABJEoRAA0FAAAAAAAAAAA='
tv_lg_HDMI3:
friendly_name: "LG Tv"
tv_lg_hdmi3:
friendly_name: "LG Tv HDMI3"
command_on: 'JgBIAAABIZMSFBISETgRExEUERQQFBETEjcTNhMSETgRNxE3EjcROBM2ERMSFBE4ERMSNxM2EjUSFBE2ETgRExM2ExITEhATEwANBQ=='
tv_lg_AV1_AV2:
friendly_name: "LG Tv"
tv_lg_av1_av2:
friendly_name: "LG Tv AV12"
command_on: 'JgBIAAABIpQPFBITETgSEw8UEhQSEhEVDzgSOBAUETgQOQ84EjgRNxITETgSExA5EDgREhI3EhMROBMSEDkQFBETEjYTEhE4EQANBQ=='
command_off: 'JgBIAAABH5YPFBETETgUERAUEBURFBATETgROBEUETcSNxE4ETcSOBISEBUQFREUEjUSFBA5ETcRNxE4ETkQOBAUEjcRFRAUEQANBQ=='

View file

@ -36,6 +36,7 @@ switch:
sunset_colortemp: 3000
stop_colortemp: 1900
brightness: 200
disable_brightness_adjust: True
mode: xy
```
@ -48,6 +49,7 @@ Configuration variables:
- **start_colortemp** (*Optional*): The color temperature at the start. Defaults to `4000`.
- **sunset_colortemp** (*Optional*): The sun set color temperature. Defaults to `3000`.
- **stop_colortemp** (*Optional*): The color temperature at the end. Defaults to `1900`.
- **brightness** (*Optional*): The brightness of the lights. Calculated with `RGB_to_xy` by default. Setting to -1 disables brightness updates.
- **brightness** (*Optional*): The brightness of the lights. Calculated with `RGB_to_xy` by default.
- **disable_brightness_adjust** (*Optional*): If true, brightness will not be adjusted besides color temperature. Defaults to False.
- **mode** (*Optional*): Select how color temperature is passed to lights. Valid values are `xy` and `mired`. Defaults to `xy`.

View file

@ -71,5 +71,5 @@ switch:
For a check you can use the command line tool `mosquitto_pub`, shipped with `mosquitto`, to send MQTT messages. This allows you to operate your switch manually:
```bash
$ mosquitto_pub -h 127.0.0.1 -t home/bedroom/switch1set -m "ON"
$ mosquitto_pub -h 127.0.0.1 -t home/bedroom/switch1 -m "ON"
```