Merge branch 'current' into next
Commit afdd04e844: 63 changed files with 290 additions and 226 deletions.
@@ -258,7 +258,7 @@ Now say `Alexa ask homeassistant to run <some script>` and Alexa will run that s

## {% linkable_title Giving Alexa Some Personality %}

In the examples above, we told Alexa to say `OK` when she succesfully completed the task. This is effective but a little dull! We can again use [templates] to spice things up a little.
In the examples above, we told Alexa to say `OK` when she successfully completed the task. This is effective but a little dull! We can again use [templates] to spice things up a little.

First create a file called `alexa_confirm.yaml` with something like the following in it (go on, be creative!):
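The contents of `alexa_confirm.yaml` are elided from this hunk. A minimal sketch of what such a file could contain (the template-with-`random` pattern is an assumption; the phrases are invented for illustration):

```yaml
# alexa_confirm.yaml: picks a random confirmation phrase for Alexa to say
>
  {{ [
    "OK",
    "Sure",
    "If you insist",
    "Done"
  ] | random }}
```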
@@ -80,3 +80,7 @@ $ ffmpeg -i YOUR_INPUT -an -filter:v select=gt(scene\,0.1) -f framemd5 -

```

If you are running into trouble with this sensor, please refer to the [troubleshooting section](/components/ffmpeg/#troubleshooting).

#### {% linkable_title Tips %}

- Use motion detection only in a custom area with a [crop filter](https://ffmpeg.org/ffmpeg-filters.html#crop): ```extra_arguments: -filter:v "crop=100:100:12:34"```
@@ -1,7 +1,7 @@

---
layout: page
title: "FFmpeg Camera"
description: "Instructions how to integrate a Video fees with FFmpeg as cameras within Home Assistant."
description: "Instructions on how to integrate a video feed via FFmpeg as a camera within Home Assistant."
date: 2016-08-13 08:00
sidebar: true
comments: false
@@ -13,7 +13,7 @@ ha_release: 0.26

---

The `ffmpeg` platform allows you to use every video feed with [FFmpeg](http://www.ffmpeg.org/) as camera in Home Assistant. The input for ffmpeg need to support that could have multiple connection to source (input) in same time. For every user in UI and all 10 seconds (snapshot image) it make a new connection/reading to source. Normally that should never be a trouble only in strange selfmade constructs can be make mistakes.
The `ffmpeg` platform allows you to use any video feed as a camera in Home Assistant via [FFmpeg](http://www.ffmpeg.org/). This video source must support multiple simultaneous reads, because for every concurrent Home Assistant user, a connection will be made to the source every 10 seconds. Normally this should not be a problem.

To enable your FFmpeg feed in your installation, add the following to your `configuration.yaml` file:
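The configuration block itself is elided from this hunk. Judging from the `camera:` context line in the next hunk, it looks roughly like this sketch (the RTSP URL is a placeholder):

```yaml
camera:
  platform: ffmpeg
  input: rtsp://user:password@192.168.1.10/live
  name: FFmpeg Camera
```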
@@ -26,13 +26,13 @@ camera:

Configuration variables:

- **input** (*Required*): A ffmpeg compatible input file, stream or feed.
- **name** (*Optional*): This parameter allows you to override the name of your camera.
- **extra_arguments** (*Optional*): Extra option they will pass to `ffmpeg`. i.e. image quality or video filter options.
- **input** (*Required*): An FFmpeg-compatible input file, stream, or feed.
- **name** (*Optional*): Override the name of your camera.
- **extra_arguments** (*Optional*): Extra options to pass to `ffmpeg`, e.g. image quality or video filter options.

### {% linkable_title Image quality %}

You can control the `image quality` with [`extra_arguments`](https://www.ffmpeg.org/ffmpeg-codecs.html#jpeg2000) `-q:v 2-32` or with lossless option `-pred 1`.
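As a sketch of how this combines with the `extra_arguments` option documented above (the `input` URL is a placeholder), forcing the best JPEG quality would look like:

```yaml
camera:
  platform: ffmpeg
  input: rtsp://user:password@192.168.1.10/live
  # -q:v ranges from 2 (best) to 32 (worst); -pred 1 selects the lossless option
  extra_arguments: -q:v 2
```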

If you are running into trouble with this sensor, please refer to this [Troubleshooting section](/components/ffmpeg/#troubleshooting).
If you are running into trouble with this sensor, please refer to the [Troubleshooting section](/components/ffmpeg/#troubleshooting).
@@ -35,7 +35,7 @@ Configuration variables:

- **ssh_key** (*Optional*): The path to your SSH private key file associated with your given admin account (instead of password).

<p class='note warning'>
You need to enable telnet on your router if you choose to use `protocol: telnet`.
You need to [enable telnet](https://www.asus.com/support/faq/1005449/) on your router if you choose to use `protocol: telnet`.
</p>

See the [device tracker component page](/components/device_tracker/) for instructions how to configure the people to be tracked.
@@ -15,6 +15,11 @@ ha_release: pre 0.7

The `tplink` platform allows you to detect presence by looking at connected devices to a [TP-Link](https://www.tp-link.com) device. This includes the ArcherC9 line.

<p class='note'>
TP-Link devices typically only allow one login at a time to the admin console. This component will count towards your one allowed login. Depending on how aggressively you configure device_tracker, you may not be able to access the admin console of your TP-Link device without first stopping Home Assistant (and waiting a few minutes for the session to time out) before you'll be able to log in.
</p>

```yaml
# Example configuration.yaml entry
device_tracker:
@@ -13,7 +13,7 @@ ha_release: 0.14

---

This platform allows you to detect presence by looking at connected devices to a [Ubiquiti](http://ubnt.com/) [Unifi](https://www.ubnt.com/enterprise/#unifi) controller.
This platform allows you to detect presence by looking at devices connected to a [Ubiquiti](http://ubnt.com/) [Unifi](https://www.ubnt.com/enterprise/#unifi) controller.

To use this device tracker in your installation, add the following to your `configuration.yaml` file:
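The example block is elided from this hunk. A minimal sketch (the `host`, `username`, and `password` option names are assumptions, modeled on other device tracker platforms) could be:

```yaml
device_tracker:
  platform: unifi
  host: 192.168.1.2
  username: YOUR_USERNAME
  password: YOUR_PASSWORD
```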
@@ -41,7 +41,7 @@ Configuration variables:

- `script`
- `scene`

- **expose_by_default** (*Optional*): Whether or not entities should be exposed via the bridge by default instead of explicitly (see the 'echo' attribute later on). If not specified, this defaults to true.
- **expose_by_default** (*Optional*): Whether or not entities should be exposed via the bridge by default instead of explicitly (see the 'emulated_hue' customization below). If not specified, this defaults to true.

- **exposed_domains** (*Optional*): The domains that are exposed by default if `expose_by_default` is set to true. If not specified, this defaults to the following list:
- `switch`
@@ -11,10 +11,10 @@ logo: ffmpeg.png

ha_category: Hub
---

It allow other Home-Assistant components to process video/audio streams. It need a ffmpeg binary in your system path. It support all ffmpeg version since 3.0.0. If you have a older version, please update.
The FFmpeg component allows other Home Assistant components to process video and audio streams. This component supports all FFmpeg versions since 3.0.0; if you have an older version, please update.

<p class='note'>
You need a `ffmpeg` binary in your system path. On Debain 8 or Raspbian (Jessie) you can install it from backports. If you want Hardware support on a Raspberry Pi you need to build from source by yourself. Windows binary are avilable on the [FFmpeg](http://www.ffmpeg.org/) website.
You need the `ffmpeg` binary in your system path. On Debian 8 or Raspbian (Jessie) you can install it from [debian-backports](https://backports.debian.org/Instructions/). If you want hardware acceleration support on a Raspberry Pi, you will need to build it from source yourself. Windows binaries are available on the [FFmpeg](http://www.ffmpeg.org/) website.
</p>

To set it up, add the following information to your `configuration.yaml` file:
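The example block is elided here. Based on the `ffmpeg:` context line and the configuration variables listed in the next hunk, a sketch would be:

```yaml
ffmpeg:
  ffmpeg_bin: /usr/bin/ffmpeg
  run_test: true
```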
@@ -25,20 +25,20 @@ ffmpeg:

Configuration variables:

- **ffmpeg_bin** (*Optional*): Set the ffmpeg binary (eg. `/usr/bin/ffmpeg`). Default 'ffmpeg'.
- **run_test** (*Optional*): Check if `input` is usable by ffmpeg. Default True.
- **ffmpeg_bin** (*Optional*): Default 'ffmpeg'. The name or path to the `ffmpeg` binary.
- **run_test** (*Optional*): Default True. Check if `input` is usable by ffmpeg.

### {% linkable_title Troubleshooting %}

In most of case, `ffmpeg` autodetect all needed options to read a video/audio stream or file. But it is possible in rare cases that's needed to set a option to help `ffmpeg`. Per default `ffmpeg` use 5 seconds to detect all options or abort.
In most cases, `ffmpeg` automatically detects all needed options to read a video or audio stream or file. But it is possible in rare cases that you will need to set options to help `ffmpeg` out.

First check, if your stream playable by `ffmpeg` with (use option `-an` or `-vn` to disable video or audio stream):
First check that your stream is playable by `ffmpeg` outside of Home Assistant (use option `-an` or `-vn` to disable the video or audio stream):

```
$ ffmpeg -i INPUT -an -f null -
```

Now you can see what going wrong. Following list could be help to solve your trouble:
Now you should be able to see what is going wrong. The following list contains some common problems and solutions:

- `[rtsp @ ...] UDP timeout, retrying with TCP`: You need to set RTSP transport in the configuration with: `input: -rtsp_transport tcp -i INPUT`
- `[rtsp @ ...] Could not find codec parameters for stream 0 (Video: ..., none): unspecified size`: FFmpeg need more data or time for autodetect. You can set the `analyzeduration` and/or `probesize` option, play with this value. If you know the needed value you can set it with: `input: -analyzeduration xy -probesize xy -i INPUT`. More information about that can be found on [FFmpeg](https://www.ffmpeg.org/ffmpeg-formats.html#Description).
- `[rtsp @ ...] UDP timeout, retrying with TCP`: You need to set an RTSP transport in the configuration with: `input: -rtsp_transport tcp -i INPUT`
- `[rtsp @ ...] Could not find codec parameters for stream 0 (Video: ..., none): unspecified size`: FFmpeg needs more data or time for autodetection (the default is 5 seconds). You can set the `analyzeduration` and/or `probesize` options to experiment with giving FFmpeg more leeway. If you find the needed value, you can set it with: `input: -analyzeduration xy -probesize xy -i INPUT`. More information about this can be found [here](https://www.ffmpeg.org/ffmpeg-formats.html#Description).
@@ -7,7 +7,7 @@ sidebar: true

comments: false
sharing: true
footer: true
logo:
logo: graphite.png
ha_category: History
ha_release: 0.13
---
@@ -15,7 +15,7 @@ featured: true

Philips Hue support is integrated into Home Assistant as a light platform. The preferred way to setup the Philips Hue platform is by enabling the [the discovery component](/components/discovery/).

Once discovered, locate "configurator.philips_hue" in the entities list ( < > ) and add it to configuration.yaml. Restart home assistant so that it is visible in the home assistant dashboard. Once home assistant is restarted, locate and click on configurator.philips_hue to bring up the intitiation dialog. This will prompt you to press the Hue button to register the Hue hub in home assistant. Once complete, the configurator entity can be removed from configuration.yaml.
Once discovered, locate "configurator.philips_hue" in the entities list ( < > ) and add it to configuration.yaml. Restart home assistant so that it is visible in the home assistant dashboard. Once home assistant is restarted, locate and click on configurator.philips_hue to bring up the initiation dialog. This will prompt you to press the Hue button to register the Hue hub in home assistant. Once complete, the configurator entity can be removed from configuration.yaml.

Restarting home assistant once more should result in the Hue lights listed as "light" entities. Add these light entities to configuration.yaml and restart home assistant once more to complete the installation.
@@ -7,11 +7,14 @@ sidebar: true

comments: false
sharing: true
footer: true
logo: hyperion.png
ha_category: Light
ha_release: 0.7.6
---

This platform allows you to integrate your [Hyperion](https://github.com/tvdzwan/hyperion/wiki) into Home Assistant.
This platform allows you to integrate your [Hyperion](https://hyperion-project.org/wiki) into Home Assistant.

Hyperion is an open-source Ambilight implementation which runs on many platforms.

```yaml
# Example configuration.yaml entry
@@ -110,16 +110,15 @@ The LIRC component fires `ir_command_received` events on the bus. You can captur

```yaml
# Example configuration.yaml automation entry
automation:
- alias: Off on Remote
  trigger:
    platform: event
    event_type: ir_command_received
    event_data:
      button_name: KEY_0
  action:
    service: homeassistant.turn_off
    entity_id: group.a_lights

- alias: Off on Remote
  trigger:
    platform: event
    event_type: ir_command_received
    event_data:
      button_name: KEY_0
  action:
    service: homeassistant.turn_off
    entity_id: group.a_lights
```

The `button_name` data values (e.g. `KEY_0`) are set by you in the `.lircrc` file.
@@ -6,6 +6,7 @@ date: 2016-04-29 16:50

sidebar: true
comments: false
sharing: true
logo: logentries.png
footer: true
ha_category: "History"
---
@@ -27,13 +27,28 @@ Steps to configure your Amazon Fire TV stick with Home Assistant:

- Find Amazon Fire TV device IP:
  - From the main (Launcher) screen, select Settings.
  - Select System > About > Network.
- `pip install firetv[firetv-server]` into a Python 2.x environment
- If installed on Debian Jessie then the libssl-dev and python-dev packages are needed. Install them with `apt-get install libssl-dev python-dev`
- The following commands must be run in a Python 2.x environment. They will allow the component to function in an Ubuntu 16.04/Hassbian environment.
  - `apt-get install swig libssl-dev python-dev libusb-1.0-0`
  - `pip install flask`
  - `pip install https://pypi.python.org/packages/source/M/M2Crypto/M2Crypto-0.24.0.tar.gz`
  - `pip install firetv[firetv-server]`
- `firetv-server -d <fire tv device IP>:5555`, background the process
- Navigate to http://localhost:5556/devices/list
- You will get an output similar to below:

```json
{
  "devices": {
    "default": {
      "host": "192.168.1.153:5555",
      "state": "play"
    }
  }
}
```

- The `"default"` above is the device name you will need to use for your `configuration.yaml`
- Configure Home Assistant as follows:

To add FireTV to your installation, add the following to your `configuration.yaml` file:
To add FireTV to your installation, note your device name from above and add the following to your `configuration.yaml` file:

```yaml
# Example configuration.yaml entry
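The entry itself is elided from this hunk. A sketch (the `host` and `device` option names are assumptions, reusing the `default` device name and the `firetv-server` port from the steps above):

```yaml
media_player:
  platform: firetv
  host: localhost:5556
  device: default
  name: Fire TV
```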
@@ -24,30 +24,26 @@ To add a device to your installation, add the following to your `configuration.y

```yaml
# Example configuration.yaml entry
media_player:
  platform: russound_rnet
  host: 192.168.1.10
  port: 1337
  name: Russound
  zones:
    1:
      name: Main Bedroom
    2:
      name: Living Room
    3:
      name: Kitchen
    4:
      name: Bathroom
    5:
      name: Dining Room
    6:
      name: Guest Bedroom
  sources:
    - name: Sonos
    - name: Sky+
    - name: iPod
    - name: Unused 1
    - name: Unused 2
    - name: Kodi
  - platform: russound_rnet
    host: 192.168.1.10
    port: 1337
    name: Russound
    zones:
      1:
        name: Main Bedroom
      2:
        name: Living Room
      3:
        name: Kitchen
      4:
        name: Bathroom
      5:
        name: Dining Room
      6:
        name: Guest Bedroom
    sources:
      - name: Sonos
      - name: Sky+
```

Configuration variables:
@@ -21,30 +21,30 @@ A Universal Media Player is created in `configuration.yaml` as follows.

```yaml
# Example configuration.yaml entry
media_player:
  platform: universal
  name: MEDIA_PLAYER_NAME
  children:
    - media_player.CHILD_1_ID
    - media_player.CHILD_2_ID
  commands:
    turn_on:
      service: SERVICE
      data: SERVICE_DATA
    turn_off:
      service: SERVICE
      data: SERVICE_DATA
    volume_up:
      service: SERVICE
      data: SERVICE_DATA
    volume_down:
      service: SERVICE
      data: SERVICE_DATA
    volume_mute:
      service: SERVICE
      data: SERVICE_DATA
  attributes:
    is_volume_muted: ENTITY_ID|ATTRIBUTE
    state: ENTITY_ID|ATTRIBUTE
  - platform: universal
    name: MEDIA_PLAYER_NAME
    children:
      - media_player.CHILD_1_ID
      - media_player.CHILD_2_ID
    commands:
      turn_on:
        service: SERVICE
        data: SERVICE_DATA
      turn_off:
        service: SERVICE
        data: SERVICE_DATA
      volume_up:
        service: SERVICE
        data: SERVICE_DATA
      volume_down:
        service: SERVICE
        data: SERVICE_DATA
      volume_mute:
        service: SERVICE
        data: SERVICE_DATA
    attributes:
      is_volume_muted: ENTITY_ID|ATTRIBUTE
      state: ENTITY_ID|ATTRIBUTE
```

Configuration variables:
@@ -161,7 +161,7 @@ Home Assistant will automatically load the correct certificate if you connect to

### {% linkable_title Publish service %}

The MQTT component will register the service `publish` which allows publishing messages to MQTT topics. There are two ways of specifiying your payload. You can either use `payload` to hard-code a payload or use `payload_template` to specify a [template](/topics/templating/) that will be rendered to generate the payload.
The MQTT component will register the service `publish` which allows publishing messages to MQTT topics. There are two ways of specifying your payload. You can either use `payload` to hard-code a payload or use `payload_template` to specify a [template](/topics/templating/) that will be rendered to generate the payload.

```json
{
@@ -15,7 +15,7 @@ ha_release: pre 0.7

The `pushetta` notify platform uses [Pushetta](http://www.pushetta.com) to deliver notifications from Home Assistant to your devices.

To retrieve the API token, log into your account at (http://www.pushetta.com)[http://www.pushetta.com] and go to your **Dashboard**. Create a new channel by clicking on **Channels** and then **Add a Channel**.
To retrieve the API token, log into your account at [http://www.pushetta.com](http://www.pushetta.com) and go to your **Dashboard**. Create a new channel by clicking on **Channels** and then **Add a Channel**.

To enable Pushetta notifications in your installation, add the following to your `configuration.yaml` file:
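The example block is elided from this hunk. A minimal sketch (the `api_key` and `channel_name` option names are assumptions, using the token and channel created in the steps above):

```yaml
notify:
  platform: pushetta
  name: pushetta
  api_key: YOUR_API_TOKEN
  channel_name: YOUR_CHANNEL
```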
@@ -20,7 +20,7 @@ The requirements are:

- You need a [Telegram bot](https://core.telegram.org/bots). Please follow those [instructions](https://core.telegram.org/bots#botfather) to create one and get the token for your bot. Keep in mind that bots are not allowed to contact users. You need to make the first contact with your user, meaning that you need to send a message to the bot from your user.
- The `chat_id` of a user.

The quickest way to retrieve your `chat_id` is visiting [https://api.telegram.org/botYOUR_API_TOKEN/getUpdates](https://api.telegram.org/botYOUR_API_TOKEN/getUpdates).
The quickest way to retrieve your `chat_id` is visiting [https://api.telegram.org/botYOUR_API_TOKEN/getUpdates](https://api.telegram.org/botYOUR_API_TOKEN/getUpdates) or to use `$ curl -X GET https://api.telegram.org/botYOUR_API_TOKEN/getUpdates`. Replace `YOUR_API_TOKEN` with your actual token.

The result set will include your chat ID as `id` in the `from` section:
@@ -29,13 +29,14 @@ The result set will include your chat ID as `id` in the `from` section:

"message":{"message_id":27,"from":{"id":123456789,"first_name":"YOUR_FIRST_NAME YOUR_NICK_NAME","last_name":"YOUR_LAST_NAME","username":"YOUR_NICK_NAME"},"chat":{"id":123456789,"first_name":"YOUR_FIRST_NAME YOUR_NICK_NAME","last_name":"YOUR_LAST_NAME","username":"YOUR_NICK_NAME","type":"private"},"date":1678292650,"text":"test"}}]}
```

Another way to get your chat ID directly is described below:
Another way to get your chat ID directly is described below. Start your Python interpreter from the command-line:

```python
import telegram
bot = telegram.Bot(token='YOUR_API_TOKEN')
chat_id = bot.getUpdates()[-1].message.chat_id
print(chat_id)
$ python3
>>> import telegram
>>> bot = telegram.Bot(token='YOUR_API_TOKEN')
>>> chat_id = bot.getUpdates()[-1].message.chat_id
>>> print(chat_id)
123456789
```
@@ -7,6 +7,7 @@ sidebar: true

comments: false
sharing: true
footer: true
logo: coinmarketcap.png
ha_category: Finance
ha_release: 0.28
ha_iot_class: "Cloud Polling"
@@ -15,6 +15,10 @@ ha_release: 0.26

The `fastdotcom` sensor component uses the [Fast.com](https://fast.com/) web service to measure network bandwidth performance.

<p class='note'>
Currently fast.com only supports measuring download bandwidth. If you want to measure bandwidth metrics other than download, such as ping and upload, utilize the [speedtest](/components/sensor.speedtest) component.
</p>

By default, it will run every hour. The user can change the update frequency in the config by defining the minute, hour, and day for a speedtest to run.

To add a Fast.com sensor to your installation, add the following to your `configuration.yaml` file:
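The example block is elided from this hunk. A minimal sketch would be:

```yaml
sensor:
  platform: fastdotcom
  # Optional scheduling via minute/hour/day, as described in the paragraph
  # above (the exact key shapes are assumptions)
```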
@@ -39,6 +39,9 @@ sensor

3 (you can also use larger values) measurements. This filters out single spikes. Median: 5 will also filter double spikes.
If you never have problems with spikes, median=1 will work fine.

- **monitored_conditions** (*Required*): The parameters that should be monitored.
- **timeout** (*Optional*): Define the timeout value in seconds when polling (defaults to 10 if not defined)
- **retries** (*Optional*): Define the number of retries when polling (defaults to 2 if not defined)
- **cache** (*Optional*): Define the cache expiration value in seconds (defaults to 1200 if not defined)

Note that by default the sensor is only polled once every 15 minutes. This means with the median=3 setting, it will take at least 30 minutes before the sensor will report a value after a Home Assistant restart. As the values usually change very slowly, this isn't a big problem.
Reducing polling intervals will have a negative effect on the battery life.
@@ -14,7 +14,7 @@ ha_release: 0.22

The `plex` sensor platform will monitor activity on a given [Plex Media Server](https://plex.tv/). It will create a sensor that shows the number of currently watching users as the state. If you click the sensor for more details it will show you who is watching what.

If your Plex server is on the same local network as Home Assistant, all you need to provide in the `configuration.yaml` is the host or IP address. If you want to access a remote Plex server, you must provide the Plex username, password, and optionally the server name of the remote Plex server. If no server name is given it will use the first server listed.
If your Plex server is on the same local network as Home Assistant, all you need to provide in the `configuration.yaml` is the host or IP address. If you want to access a remote Plex server, you must provide the Plex username, password, and optionally the server name of the remote Plex server. If no server name is given it will use the first server listed. If you use the username and password, all servers in that account are monitored.

If you want to enable the plex sensor, add the following lines to your `configuration.yaml`:
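The lines themselves are elided from this hunk. A minimal local-network sketch (the `host` option name is an assumption; the IP is a placeholder):

```yaml
sensor:
  platform: plex
  host: 192.168.1.100
```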
@@ -6,6 +6,7 @@ date: 2016-01-25 08:00

sidebar: true
comments: false
sharing: true
logo: statsd.png
footer: true
ha_category: "History"
ha_release: 0.12
@@ -108,5 +108,5 @@ switch:

value_template: {% raw %}'{{ value == "1" }}'{% endraw %}
```

- Replace admin and password with an "Admin" priviledged Foscam user
- Replace admin and password with an "Admin" privileged Foscam user
- Replace ipaddress with the local IP address of your Foscam
@@ -7,6 +7,7 @@ sidebar: true

comments: false
sharing: true
footer: true
logo: upnp.png
ha_category: "Other"
ha_release: 0.18
---
@@ -13,7 +13,7 @@ ha_category: Hub

The [Vera](http://getvera.com) hub is a controller mainly connecting to Z-Wave devices.

Switches, Lights (inc Dimmers), Locks, Sensors and Binary sensors are supported - and will be automaticaly added when HA connects to your Vera controller.
Switches, Lights (inc Dimmers), Locks, Sensors and Binary sensors are supported - and will be automatically added when HA connects to your Vera controller.

To use Vera devices in your installation, add the following to your configuration.yaml file using the IP and port number of your Vera controller:
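The example block is elided from this hunk. A minimal sketch (the `vera_controller_url` key is an assumption; the IP is a placeholder and 3480 is the usual Vera port):

```yaml
vera:
  vera_controller_url: http://192.168.1.161:3480/
```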
@@ -7,6 +7,7 @@ sidebar: true

comments: false
sharing: true
footer: true
logo: avahi.png
ha_category: "Other"
ha_release: 0.18
---