Merge branch 'current' into next

Paulus Schoutsen 2016-10-21 22:52:07 -07:00
commit be8ea1aef1
77 changed files with 1151 additions and 293 deletions

@@ -0,0 +1,125 @@
---
layout: page
title: "Home Assistant Database"
description: "Details about the database which Home Assistant is using."
date: 2016-10-10 10:00
sidebar: true
comments: false
sharing: true
footer: true
---
The default database used by Home Assistant is [SQLite](https://www.sqlite.org/) and is stored in your [configuration directory](/getting-started/configuration/), e.g. `<path to config dir>/.homeassistant/home-assistant_v2.db`. You will need an installation of `sqlite3`, the command-line tool for SQLite databases, or [DB Browser for SQLite](http://sqlitebrowser.org/), which provides a graphical editor for executing SQL commands.
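If `sqlite3` is not yet installed, it is usually available from your distribution's package manager. A minimal sketch for Debian/Ubuntu and Fedora (the package names are assumptions and may differ on your system):
```bash
# Debian/Ubuntu (package assumed to be named sqlite3)
$ sudo apt-get install sqlite3

# Fedora (package assumed to be named sqlite)
$ sudo dnf install sqlite
```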
First, load your database with `sqlite3`:
```bash
$ sqlite3 home-assistant_v2.db
SQLite version 3.13.0 2016-05-18 10:57:30
Enter ".help" for usage hints.
sqlite>
```
It helps to set a couple of options to make the output more readable:
```bash
sqlite> .header on
sqlite> .mode column
```
You can also start `sqlite3` and attach the database later. Not sure which database you are working with? Check it, especially if you are going to delete data:
```bash
sqlite> .databases
seq name file
--- --------------- ----------------------------------------------------------
0 main /home/fab/.homeassistant/home-assistant_v2.db
```
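If you started `sqlite3` without a file name, a minimal sketch for attaching the database afterwards (using the example path from above) could look like this:
```bash
sqlite> ATTACH DATABASE '/home/fab/.homeassistant/home-assistant_v2.db' AS ha;
sqlite> .databases
```
Tables of the attached database can then be referenced with the `ha.` prefix, e.g. `ha.states`.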
### {% linkable_title Schema %}
Get the schema of all tables and indexes in your current Home Assistant database:
```bash
sqlite> SELECT sql FROM sqlite_master;
-------------------------------------------------------------------------------------
CREATE TABLE events (
event_id INTEGER NOT NULL,
event_type VARCHAR(32),
event_data TEXT,
origin VARCHAR(32),
time_fired DATETIME,
created DATETIME,
PRIMARY KEY (event_id)
)
CREATE INDEX ix_events_event_type ON events (event_type)
CREATE TABLE recorder_runs (
run_id INTEGER NOT NULL,
start DATETIME,
"end" DATETIME,
closed_incorrect BOOLEAN,
created DATETIME,
PRIMARY KEY (run_id),
CHECK (closed_incorrect IN (0, 1))
)
CREATE TABLE states (
state_id INTEGER NOT NULL,
domain VARCHAR(64),
entity_id VARCHAR(64),
state VARCHAR(255),
attributes TEXT,
event_id INTEGER,
last_changed DATETIME,
last_updated DATETIME,
created DATETIME,
PRIMARY KEY (state_id),
FOREIGN KEY(event_id) REFERENCES events (event_id)
)
CREATE INDEX states__significant_changes ON states (domain, last_updated, entity_id)
CREATE INDEX states__state_changes ON states (last_changed, last_updated, entity_id)
CREATE TABLE sqlite_stat1(tbl,idx,stat)
```
To show only the details of the `states` table, which we are using in the next examples:
```bash
sqlite> SELECT sql FROM sqlite_master WHERE type = 'table' AND tbl_name = 'states';
```
### {% linkable_title Query %}
Now that the available columns in the table are identified, we are able to create a query. Let's list your top 10 entities:
```bash
sqlite> .width 30 10
sqlite> SELECT entity_id, COUNT(*) as count FROM states GROUP BY entity_id ORDER BY count DESC LIMIT 10;
entity_id count
------------------------------ ----------
sensor.cpu 28874
sun.sun 21238
sensor.time 18415
sensor.new_york 18393
cover.kitchen_cover 17811
switch.mystrom_switch 14101
sensor.internet_time 12963
sensor.solar_angle1 11397
sensor.solar_angle 10440
group.all_switches 8018
```
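Along the same lines, a query sketch for counting the recorded events by type, based on the `events` table from the schema above:
```bash
sqlite> SELECT event_type, COUNT(*) as count FROM events GROUP BY event_type ORDER BY count DESC LIMIT 10;
```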
### {% linkable_title Delete %}
If you don't want to keep the history of certain entities, you can delete their stored states permanently:
```bash
sqlite> DELETE FROM states WHERE entity_id = 'sensor.cpu';
```
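Before deleting, it helps to check how many rows match your filter; a minimal sketch using the same entity:
```bash
sqlite> SELECT COUNT(*) FROM states WHERE entity_id = 'sensor.cpu';
```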
The `VACUUM` command cleans up your database and frees unused space:
```bash
sqlite> VACUUM;
```
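To see the effect, you can compare the size of the database file before and after running `VACUUM` from a regular shell (the path is the example one from above):
```bash
$ ls -lh /home/fab/.homeassistant/home-assistant_v2.db
```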
For a more interactive way to work with the database or to create statistics, check out our [Jupyter notebooks](http://nbviewer.jupyter.org/github/home-assistant/home-assistant-notebooks/blob/master/).

@@ -3,7 +3,7 @@ layout: page
title: "Entity component platform options"
description: "Shows how to customize polling interval for any component via configuration.yaml."
date: 2016-02-12 23:17 -0800
sidebar: true
sidebar: false
comments: false
sharing: true
footer: true

@@ -58,10 +58,6 @@ zwave:
config_path: /usr/local/share/python-openzwave/config
polling_interval: 10000
#zigbee:
# device: /dev/ttyUSB1
# baud: 115200
mqtt:
broker: 127.0.0.1
```
@@ -115,10 +111,7 @@ This (large) sensor configuration gives us another example:
```yaml
### sensors.yaml
##############################################################
### METEOBRIDGE ####
##############################################################
### METEOBRIDGE #############################################
- platform: tcp
name: 'Outdoor Temp (Meteobridge)'
host: 192.168.2.82
@@ -134,27 +127,14 @@ This (large) sensor configuration gives us another example:
payload: "Content-type: text/xml; charset=UTF-8\n\n"
value_template: "{% raw %}{{value.split (' ')[3]}}{% endraw %}"
unit: Percent
- platform: tcp
name: 'Outdoor Dewpoint (Meteobridge)'
host: 192.168.2.82
port: 5556
timeout: 6
payload: "Content-type: text/xml; charset=UTF-8\n\n"
value_template: "{% raw %}{{value.split (' ')[4] }}{% endraw %}"
unit: C
###################################
#### STEAM FRIENDS ####
##################################
#### STEAM FRIENDS ##################################
- platform: steam_online
api_key: [not telling]
accounts:
- 76561198012067051
##################################
#### TIME/DATE ####
##################################
#### TIME/DATE ##################################
- platform: time_date
display_options:
- 'time'
@@ -165,12 +145,6 @@ This (large) sensor configuration gives us another example:
- platform: worldclock
time_zone: America/New_York
name: 'Ann Arbor'
- platform: worldclock
time_zone: Europe/Vienna
name: 'Innsbruck'
- platform: worldclock
time_zone: America/New_York
name: 'Ann Arbor'
```
You'll notice that this example includes a secondary parameter section (under the steam section) as well as a better example of how comments can be used to break a file into sections.

@@ -0,0 +1,39 @@
---
layout: page
title: "Details about the web server"
description: "Use nmap to scan your Home Assistant instance."
date: 2016-10-06 08:00
sidebar: false
comments: false
sharing: true
footer: true
ha_category: Infrastructure
---
It was only a matter of time until the first searches for Home Assistant instances showed up on tools like [https://www.shodan.io](https://www.shodan.io/search?query=Home+Assistant).
To get an idea of what your Home Assistant instance looks like to a network scanner, you can use `nmap`. The `nmap` tool is already available if you are using the [nmap device tracker](/components/device_tracker/).
```bash
$ nmap -sV -p 8123 --script=http-title,http-headers 192.168.1.3
Starting Nmap 7.12 ( https://nmap.org ) at 2016-10-06 10:01 CEST
Nmap scan report for 192.168.1.3 (192.168.1.3)
Host is up (0.00011s latency).
PORT STATE SERVICE VERSION
8123/tcp open http CherryPy wsgiserver
| http-headers:
| Content-Type: text/html; charset=utf-8
| Content-Length: 4309
| Connection: close
| Date: Thu, 06 Oct 2016 08:01:31 GMT
| Server: Home Assistant
|
|_ (Request type: GET)
|_http-server-header: Home Assistant
|_http-title: Home Assistant
Service detection performed. Please report any incorrect results at https://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in 6.70 seconds
```