Compare commits

...

96 Commits

Author SHA1 Message Date
SchiZzA e737fb16d3
Use configured WSLink add-on port for health/status endpoints. 2026-03-23 18:32:02 +01:00
SchiZzA 159d465db5
Improve sensor value parsing and add battery binary sensors.
- Introduce `to_int`/`to_float` helpers to safely handle None/blank payload values.
- Use the converters across weather/wslink sensor descriptions and simplify wind direction handling.
- Add low-battery binary sensor entities and definitions.
2026-03-23 18:21:05 +01:00
SchiZzA 9d5fafa8d0
Add configuration options for WSLink Addon port. 2026-03-23 18:14:46 +01:00
SchiZzA 63660006ea
Add health diagnostics coordinator and routing snapshot
Track ingress/forwarding status, expose detailed health sensors and translations, and include redacted diagnostics data.
2026-03-14 17:39:52 +01:00
SchiZzA 39b16afcbc
Update constants to more readable form. 2026-03-05 11:47:52 +01:00
SchiZzA f0554573ce
Make routes method-aware and update related tests
Include HTTP method in route keys and dispatch, and fix
Routes.show_enabled.
Update register_path to accept a HealthCoordinator and adjust router
stubs in tests. Update WindyPush tests to use response objects
(status/text)
and adapt related exception/notification expectations.
2026-03-04 07:53:26 +01:00
SchiZzA 995f607cf7
Improve Windy error handling and retry logic 2026-03-03 14:17:34 +01:00
SchiZzA 3e573087a2
Add multiple health sensors and device info
Introduce HealthSensorEntityDescription and a tuple of sensor
descriptions for integration status, source IP, base URL and addon
response. Instantiate one HealthDiagnosticSensor per description in
async_setup_entry. Update HealthDiagnosticSensor to accept a
description, derive unique_id from description.key and add a cached
device_info returning a SERVICE-type device. Adjust imports.
2026-03-02 22:08:40 +01:00
SchiZzA 6a4eed2ff9
Validate hass data with py_typecheck.checked
Replace manual isinstance checks and casts with py_typecheck.checked()
to validate hass and entry data and return early on errors. Simplify
add_new_sensors by unwrapping values, renaming vars, and passing the
coordinator to WeatherSensor
2026-03-02 22:08:01 +01:00
SchiZzA b3aae77132
Replace casts with checked type helpers
Use checked and checked_or to validate option and hass.data types,
remove unsafe typing.cast calls, simplify coordinator and entry_data
handling, and cache boolean option flags for Windy and Pocasí checks
2026-03-02 22:06:09 +01:00
SchiZzA 7d1494f29b
Removed extended debugging info from sensors. Added descriptive method on route info. 2026-03-01 17:32:36 +01:00
SchiZzA 01058a07b4
Add WSLink support for additional sensor channels
- Extend constants and WSLink key remapping for channels 3–8 (temp, humidity, battery and connection)
- Add new WSLink sensor entity descriptions for the extra channel readings
- Update English translations for the newly added channel sensors and battery states
2026-03-01 16:56:46 +01:00
SchiZzA f06f8b31ae
Align Windy resend with Stations API response handling
- Add WINDY_MAX_RETRIES constant and use it consistently when deciding to disable resending
- Refactor Windy response verification to rely on HTTP status codes per stations.windy.com API
- Improve error handling for missing password, duplicate payloads and rate limiting
- Enhance retry logging and disable Windy resend via persistent notification on repeated failures
2026-03-01 13:51:17 +01:00
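The retry-and-disable behavior described above can be sketched as a small guard object. `WINDY_MAX_RETRIES` is the constant named in the commit; its value and the exact counting logic here are assumptions.

```python
WINDY_MAX_RETRIES = 5  # name from the commit; the value here is an assumption


class WindyResendGuard:
    """Count consecutive failed pushes and decide when resending should be disabled."""

    def __init__(self, max_retries: int = WINDY_MAX_RETRIES) -> None:
        self.max_retries = max_retries
        self.failures = 0

    def record(self, status: int) -> bool:
        """Record an HTTP status; return True while resending should stay enabled.

        Any 2xx response resets the failure counter; reaching max_retries
        consecutive failures signals the caller to disable Windy resend
        (and, in the integration, raise a persistent notification).
        """
        if 200 <= status < 300:
            self.failures = 0
            return True
        self.failures += 1
        return self.failures < self.max_retries
```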
SchiZzA 95663fd78b
Stabilize webhook routing and config updates
- Register aiohttp webhook routes once and switch the active dispatcher handler on option changes
- Make the internal route registry method-aware (GET/POST) and improve enabled-route logging
- Fix OptionsFlow initialization by passing the config entry and using safe defaults for credentials
- Harden Windy resend by validating credentials early, auto-disabling the feature on invalid responses, and notifying the user
- Update translations for Windy credential validation errors
2026-03-01 12:48:23 +01:00
SchiZzA cc1afaa218
Organize imports in `__init__.py` 2026-02-26 17:59:25 +01:00
SchiZzA 9255820a13
Added support for GET and POST endpoints. 2026-02-26 17:58:12 +01:00
SchiZzA 1bbeab1ffe
Add health diagnostic sensor and extensive tests for sws12500
integration

- Introduce HealthDiagnosticSensor for device health status reporting
- Add new constants and data keys for health sensor integration
- Wire health_sensor module into sensor platform setup
- Refactor sensor descriptions to improve derived sensor handling
- Implement pytest fixtures and comprehensive tests covering:
  - Config flows and options validation
  - Data reception and authentication
  - Sensor platform setup and dynamic sensor addition
  - Push integration with Pocasi.cz and Windy API
  - Route dispatching and error handling
  - Utilities, conversions, and translation functions
2026-02-21 11:29:40 +01:00
SchiZzA 214b8581b0
Remove numpy dependency and simplify range checks in heat_index function 2026-02-06 18:32:56 +01:00
SchiZzA c84e112073
Fix typo in fallback handler name from unregistred to unregistered 2026-02-06 18:22:31 +01:00
SchiZzA fa3b5aa523
Anonymize data payload in PocasiPush logging 2026-02-06 18:20:17 +01:00
SchiZzA f608ab4991
Remove verify_ssl=False from async_get_clientsession calls 2026-02-06 18:14:02 +01:00
SchiZzA 0c67b8d2ab
Update Windy integration to use station ID and password authentication
- Replace API key with station ID and password for authentication
- Change Windy API endpoint to v2 observation update
- Adapt data conversion for WSLink to Windy format
- Update config flow and translations accordingly
2026-02-06 15:16:05 +01:00
SchiZzA 176420aa43
Update README.md 2026-02-03 18:03:08 +01:00
schizza c8466fdaa5
Cherry pick README.md 2026-02-03 17:53:13 +01:00
SchiZzA 1fec8313d4
Refactor SWS12500 integration for push-based updates
- Add detailed architecture overview in __init__.py
- Introduce shared runtime keys in data.py to avoid key conflicts
- Implement dual route dispatcher for webhook endpoint selection
- Enhance sensor platform for dynamic sensor addition without reloads
- Rename battery level "unknown" to "drained" with updated icons
- Improve utils.py with centralized helpers for remapping and discovery
- Update translations and strings for consistency
2026-01-27 09:21:27 +01:00
SchiZzA a3dc3d0d53
Improve data validation and error logging in utils.py 2026-01-18 21:48:24 +01:00
SchiZzA 234840e115
Rename battery_level_to_text to battery_level and update docstring 2026-01-18 19:36:33 +01:00
SchiZzA a20369bab3
Fix logging of unregistered route to include path 2026-01-18 19:35:51 +01:00
SchiZzA 08b812e558
Improve type safety, add data validation, and refactor route handling
- Add typing and runtime checks with py_typecheck for config options and
  incoming data
- Enhance authentication validation and error handling in data
  coordinator
- Refactor Routes class with better dispatch logic and route
  enabling/disabling
- Clean up async functions and improve logging consistency
- Add type hints and annotations throughout the integration
- Update manifest to include typecheck-runtime dependency
2026-01-18 17:53:28 +01:00
SchiZzA 39cd852b36
Merge remote-tracking branch 'origin/stable' into ecowitt_support 2025-12-22 16:43:04 +01:00
Lukas Svoboda 94ec3cb0e5
Merge pull request #86 from schizza/fix/index_computing
Fix Heat Index and Chill Index
2025-12-22 11:47:42 +01:00
SchiZzA 743e62c6dd
Fix Heat Index and Chill Index
Fixed computing of the Heat and Chill index in cases where the outside sensor
battery is drained and the station stops sending data for outside temperature,
humidity and wind.
2025-12-22 11:47:02 +01:00
SchiZzA 466d41f1bb
Config_flow for Ecowitt 2025-12-09 12:01:32 +01:00
SchiZzA e34f73a467
Refactor config_flow, add support for Ecowitt configuration 2025-12-07 17:07:11 +01:00
Lukas Svoboda 6edaec73d8
Merge pull request #83 from schizza/recactor/fixes
Recactor/fixes
2025-11-17 00:29:50 +01:00
Lukas Svoboda 44d0ee5c7b
Merge branch 'stable' into recactor/fixes 2025-11-17 00:29:40 +01:00
SchiZzA e482fcea2b
Fix typecheck issues 2025-11-17 00:28:37 +01:00
Lukas Svoboda 826a9a71cc
Merge pull request #82 from schizza/readme_update
Update README
2025-11-16 22:40:14 +01:00
SchiZzA 08de8b5570
Update README 2025-11-16 22:36:05 +01:00
SchiZzA 0679f1e559
Fix typos, fix await in windy_func 2025-11-16 19:18:29 +01:00
Lukas Svoboda 0c42c8d827
Merge pull request #81 from schizza/feature/pocasi_cz
Support for resending data to Pocasi Meteo CZ
2025-11-16 19:09:53 +01:00
SchiZzA de346ed914
Fix: Retain data for other options while configuring Pocasi CZ 2025-11-16 19:02:24 +01:00
SchiZzA 7950e1be46
Add Pocasi CZ push to server support
Added `pocasi_cz.py` component to handle resending data to Pocasi CZ
server.
2025-11-16 18:37:01 +01:00
SchiZzA 0d47e14834
Add Počasí Meteo CZ integration to options flow and constants
Introduce new config step for Počasí Meteo CZ with validation and UI
schema. Define related constants and translation keys for setting up
data forwarding.
2025-11-16 15:37:13 +01:00
SchiZzA 92eadbb4e2
Typo update 2025-11-16 14:09:16 +01:00
SchiZzA 040f70c027
Adds translation keys.
Updated cs.json, en.json, strings.json
2025-11-16 13:38:11 +01:00
SchiZzA 67d8acf9cc
Add constants for `pocasimeteo_cz`
Const `POCASI_CZ_URL` added to constants. Consts `POCASI_CZ_SEND_INTERVAL`
and `POCASI_CZ_SEND_MINIMUM` are added to constants.
2025-11-16 13:13:41 +01:00
Lukas Svoboda 006376fc49
Merge pull request #80 from schizza/fix/wind_dir
Corrects wind direction sensor state class
2025-10-27 08:13:15 +01:00
schizza a43d8202dd
Corrects wind direction sensor state class
Updates the wind direction sensor to use `MEASUREMENT_ANGLE` state class for proper representation in Home Assistant.

This ensures correct interpretation of wind direction as an angle.
2025-10-27 08:12:26 +01:00
Lukas Svoboda 4ccc37951d
Bump version from 1.6.8 to 1.6.9 2025-10-26 22:00:39 +01:00
Lukas Svoboda 688bb9e374
Merge pull request #79 from schizza/chore/assets
Maintenance, adds workflow for ZIP assets in releases.
Updated README to view downloaded counts.
2025-10-26 21:59:35 +01:00
schizza b3032d072f
Adds assets for HACS integration
Adds download badges to the README for better visibility
and includes zip release and filename options in the HACS
manifest to enable direct downloads through HACS.
2025-10-26 21:53:58 +01:00
schizza a2bc74c2ad
Adds workflow to publish ZIP asset on release
Creates a GitHub Actions workflow that automatically builds a ZIP archive of the custom component and attaches it to new releases.
This simplifies distribution and installation for users.
2025-10-26 21:49:22 +01:00
Lukas Svoboda 59116a6c48
Merge pull request #78 from schizza/77-wbgt-temperature-sensor-wslink-api-parameter-t1wbgt
Adds WBGT temperature sensor via WSLink
2025-10-26 21:40:36 +01:00
schizza 06a8a7ff1b
Adds WBGT temperature sensor via WSLink
Adds the Wet Bulb Globe Temperature (WBGT) sensor to the integration, pulling data via the WSLink API.

Corrects the state class for rain and wind direction.
2025-10-26 21:39:31 +01:00
Lukas Svoboda 1d8928bf12
Merge pull request #76 from schizza/74-additional-sensors
Release 1.6.7
2025-10-26 21:03:23 +01:00
schizza 397005bd3f
Updates component version to 1.6.7
Bump version

This reflects changes introduced in the '74-additional-sensors'
branch.
2025-10-26 21:02:13 +01:00
schizza 80909e88c0
Adds indoor and channel 2 battery sensors
Adds support for displaying indoor console and channel 2 battery levels as sensors.

Updates sensor logic to use a common list to determine icon representation

Fixes #74
2025-10-26 20:59:57 +01:00
Lukas Svoboda 5022cb7767
Merge pull request #65 from convicte/patch-1
Implement SensorDeviceClass.WIND_DIRECTION
2025-08-28 21:58:28 +02:00
Lukas Svoboda f7cea43722
Merge branch 'stable' into patch-1 2025-08-28 21:58:15 +02:00
Lukas Svoboda 7ff8bb7f92
Merge branch 'main' into patch-1 2025-08-28 21:57:38 +02:00
Lukas Svoboda dbebc501e3
Bump version from 1.6.5 to 1.6.6 2025-08-28 21:55:01 +02:00
Lukas Svoboda 8247f2b854
Merge pull request #72 from FerronN/ft-add-wslink-battery-level
Add outside battery sensor and related translations
2025-08-28 21:53:27 +02:00
Ferron Nijland d48f0fda6e Merge branch 'ft-add-wslink-battery-level' of https://github.com/FerronN/SWS-12500-custom-component into ft-add-wslink-battery-level 2025-08-28 10:42:03 +02:00
schizza 99fd6d266c Adds outside battery sensor
Adds an outside battery sensor to the integration, providing information about the battery level of the outdoor sensor.

This includes:
- Mapping the `t1bat` WSLink item to the `OUTSIDE_BATTERY` sensor.
- Implementing logic to convert the battery level to a human-readable text representation and a corresponding icon.
- Updates precipitation to intensity and fixes data type of battery level
2025-08-28 10:41:44 +02:00
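The enum-plus-icon approach from these battery commits can be sketched like this. The "drained" state name appears in the changelog; the other member names, the raw-value mapping for `t1bat`, and the specific `mdi:` icons are illustrative assumptions.

```python
from enum import Enum


class BatteryLevel(Enum):
    """Battery states reported by the station (member names are illustrative)."""

    OK = "ok"
    LOW = "low"
    DRAINED = "drained"  # name from the changelog ("unknown" was renamed to "drained")


ICONS = {
    BatteryLevel.OK: "mdi:battery",
    BatteryLevel.LOW: "mdi:battery-30",
    BatteryLevel.DRAINED: "mdi:battery-alert-variant-outline",
}


def battery_level(raw) -> BatteryLevel:
    """Map a raw WSLink battery value (e.g. `t1bat`) to an enum member.

    Assumption: 0 means OK and any other numeric value means low; a missing
    or non-numeric value is treated as drained.
    """
    try:
        value = int(float(raw))
    except (TypeError, ValueError):
        return BatteryLevel.DRAINED
    return BatteryLevel.OK if value == 0 else BatteryLevel.LOW
```

An entity can then derive both its state string (`level.value`) and its icon (`ICONS[level]`) from the single enum member.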
schizza 64dd47a3e9 Option flow configuration
Removes the "migration" step from the option flow menu.
This step will be used in next release.
2025-08-28 10:41:44 +02:00
schizza 720c2148e6 Removes unused import from config_flow
Removes the unused import of `utils` to improve code cleanliness and avoid potential namespace conflicts.
Removed 'migration' from menu as it is intended to use in later version.
2025-08-28 10:41:44 +02:00
schizza b858f648b9 config_flow migrated to stable version.
Config flow was migrated to stable version.

Removes the unit migration flow, which is intended to introduce later.
2025-08-28 10:41:44 +02:00
schizza 07ca4a6833 Reorders constants for better readability
Reorders the import of constants to improve readability and maintain consistency within the module.

Final touches.
2025-08-28 10:41:44 +02:00
schizza de013891c0 Adds missing constant and improves readability
Adds OUTSIDE_BATTERY to the DISABLED_BY_DEFAULT list.
Improves readability by formatting long strings with parenthesis.
2025-08-28 10:41:44 +02:00
schizza 0d0922a494 Improves battery sensor display
Updates the outside battery sensor to display an icon
reflecting the battery level, enhancing the user experience
by providing a visual indication of the battery status.
2025-08-28 10:41:44 +02:00
schizza af19358ac7 Adds battery state translations
Adds translations for the battery state of the outside sensor to both the English and Czech language files.

This change provides more descriptive and user-friendly information about the battery status.
2025-08-28 10:41:44 +02:00
schizza 3dbf8b8a7a Improves battery level representation
Refactors battery level representation by using enum instead of string.

Improves battery level display by adding an icon representation.

Changes const BATLEVEL to BATTERY_LEVEL.
2025-08-28 10:41:44 +02:00
schizza a68a4c929a Update const.py 2025-08-28 10:41:44 +02:00
FerronN af286648e9 fix data parsing in sensors_wslink.py 2025-08-28 10:41:44 +02:00
FerronN b6080fe9fd Fix structure en.json 2025-08-28 10:41:44 +02:00
Ferron Nijland a07af5a4fd Add outside battery sensor and related translations 2025-08-28 10:41:44 +02:00
schizza f14e6500d4 Adds unit migration functionality to options flow
Implements a user interface to migrate units for rain sensors including migration of historic data via statistics.
This provides the user with the ability to correct rain units, if they have been set incorrectly.
Includes UI to select sensor and units, as well as trigger migration.
2025-08-28 10:40:03 +02:00
Lukas Svoboda a1f2bf10ea
Merge branch 'stable' into ft-add-wslink-battery-level 2025-08-22 18:15:11 +02:00
schizza e10ea9901c
Adds outside battery sensor
Adds an outside battery sensor to the integration, providing information about the battery level of the outdoor sensor.

This includes:
- Mapping the `t1bat` WSLink item to the `OUTSIDE_BATTERY` sensor.
- Implementing logic to convert the battery level to a human-readable text representation and a corresponding icon.
- Updates precipitation to intensity and fixes data type of battery level
2025-08-22 18:06:35 +02:00
schizza fc8349c06e
Option flow configuration
Removes the "migration" step from the option flow menu.
This step will be used in next release.
2025-08-21 16:46:13 +02:00
schizza d4d2440ae8
Removes unused import from config_flow
Removes the unused import of `utils` to improve code cleanliness and avoid potential namespace conflicts.
Removed 'migration' from menu as it is intended to use in later version.
2025-08-21 16:45:29 +02:00
schizza 827fb71e25
sensors_wslink updated to stable version
Updating to stable version, retaining CH3 sensors.
Left outside battery unchanged. Will work on bug in next commit.
2025-08-21 16:39:05 +02:00
schizza 2d758835dc
config_flow migrated to stable version.
Config flow was migrated to stable version.

Removes the unit migration flow, which is intended to introduce later.
2025-08-21 16:30:10 +02:00
schizza 0027a80968
Update const to stable version
Update constants to stable version.
2025-08-21 16:08:08 +02:00
schizza e11e068c0f Reorders constants for better readability
Reorders the import of constants to improve readability and maintain consistency within the module.

Final touches.
2025-08-18 13:29:00 +02:00
schizza 1ecd88269d Adds missing constant and improves readability
Adds OUTSIDE_BATTERY to the DISABLED_BY_DEFAULT list.
Improves readability by formatting long strings with parenthesis.
2025-08-18 12:53:40 +02:00
schizza 09d79e2032 Improves battery sensor display
Updates the outside battery sensor to display an icon
reflecting the battery level, enhancing the user experience
by providing a visual indication of the battery status.
2025-08-18 12:50:31 +02:00
schizza bbe31da4c5 Adds battery state translations
Adds translations for the battery state of the outside sensor to both the English and Czech language files.

This change provides more descriptive and user-friendly information about the battery status.
2025-08-18 10:55:37 +02:00
schizza 68da7aad98 Improves battery level representation
Refactors battery level representation by using enum instead of string.

Improves battery level display by adding an icon representation.

Changes const BATLEVEL to BATTERY_LEVEL.
2025-08-17 18:34:14 +02:00
schizza de8d2a7b0c
Update const.py 2025-08-16 17:29:22 +02:00
FerronN cf0938a6fd
fix data parsing in sensors_wslink.py 2025-07-11 10:16:59 +02:00
FerronN 4d2dedbb11
Fix structure en.json 2025-07-11 10:15:44 +02:00
Ferron Nijland feed730818 Add outside battery sensor and related translations 2025-07-09 16:26:18 +02:00
schizza b1cec2f38f
Adds CH3 temperature and humidity sensors
Enables CH3 temperature and humidity sensors for WSLink devices.
2025-07-04 19:27:45 +02:00
convicte 6eceee1f4e
Implement SensorDeviceClass.WIND_DIRECTION
This commit adds the latest SensorDeviceClass.WIND_DIRECTION to facilitate correct long term statistics collection for the wind direction sensor.
2025-03-15 23:13:21 +01:00
47 changed files with 6961 additions and 918 deletions

.github/workflows/publish-assets.yml

@@ -0,0 +1,21 @@
name: Build & Attach ZIP asset
on:
  release:
    types: [published]
jobs:
  build-zip:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Create ZIP
        run: |
          mkdir -p dist
          cd custom_components/sws12500
          zip -r ../../dist/weather-station.zip . -x "*/__pycache__/*"
      - name: Upload ZIP to release
        uses: softprops/action-gh-release@v2
        with:
          files: dist/weather-station.zip

README.md
@@ -1,31 +1,46 @@
# Integrates your Sencor SWS 12500 or 16600, GARNI, BRESSER weather stations seamlessly into Home Assistant
![GitHub Downloads](https://img.shields.io/github/downloads/schizza/SWS-12500-custom-component/total?label=downloads%20%28all%20releases%29)
![Latest release downloads](https://img.shields.io/github/downloads/schizza/SWS-12500-custom-component/latest/total?label=downloads%20%28latest%29)
This integration will listen for data from your station and passes them to respective sensors. It also provides the ability to push data to Windy API.
# Integrates your Sencor SWS 12500, SWS16600, SWS 10500, GARNI, BRESSER weather stations seamlessly into Home Assistant
_This custom component replaces [old integration via Node-RED and proxy server](https://github.com/schizza/WeatherStation-SWS12500)._
This integration will listen for data from your station and pass it to the respective sensors. It also provides the ability to push data to `Windy API` or `Pocasi Meteo`.
### In the next major release there will be support for Ecowitt stations as well
---
### In the next major release, I plan to rename the integration, as its current name no longer reflects its original purpose. The integration was initially developed primarily for the SWS12500 station, but it already supports other weather stations as well (e.g., Bresser, Garni, and others). Support for Ecowitt stations will also be added in the future, so the current name has become misleading. This information will be provided via an update, and I'm also planning to offer a full data migration from the existing integration to the new one, so you will not lose any of your historical data.
- The transition date hasn't been set yet, but it's currently expected to happen within the next ~2–3 months. At the moment, I'm working on a full refactor and general code cleanup. Looking further ahead, the goal is to have the integration fully incorporated into Home Assistant as a native component, meaning it won't need to be installed via HACS, but will become part of the official Home Assistant distribution.
- I'm also looking for someone who owns an Ecowitt weather station and would be willing to help with testing the integration for these devices.
---
## Warning - WSLink APP (applies also for SWS 12500 with firmware >3.0)
For stations that are using WSLink app to setup station and WSLink API for resending data (SWS 12500 manufactured in 2024 and later). You will need to install [WSLink SSL proxy addon](https://github.com/schizza/wslink-addon) to your Home Assistant if you are not running your Home Assistant instance in SSL mode or you do not have SSL proxy for your Home Assistant.
For stations that are using WSLink app to setup station and WSLink API for resending data (also SWS 12500 manufactured in 2024 and later). You will need to install [WSLink SSL proxy addon](https://github.com/schizza/wslink-addon) to your Home Assistant if you are not running your Home Assistant instance in SSL mode or you do not have SSL proxy for your Home Assistant.
## Requirements
- Weather station that supports sending data to custom server in their API [(list of supported stations.)](#list-of-supported-stations)
- Configure station to send data directly to Home Assistant.
- If you want to push data to Windy, you have to create an account at [Windy](https://stations.windy.com).
- If you want to resend data to `Pocasi Meteo`, you have to create an account at [Pocasi Meteo](https://pocasimeteo.cz)
## List of supported stations
## Example of supported stations
- [Sencor SWS 12500 Weather Station](https://www.sencor.cz/profesionalni-meteorologicka-stanice/sws-12500)
- [Sencor SWS 16600 WiFi SH](https://www.sencor.cz/meteorologicka-stanice/sws-16600)
- SWS 10500 (newer releases are also supported with [WSLink SSL proxy addon](https://github.com/schizza/wslink-addon))
- Bresser stations that support custom server upload. [for example, this is known to work](https://www.bresser.com/p/bresser-wi-fi-clearview-weather-station-with-7-in-1-sensor-7002586)
- Garni stations with WSLink support or custom server support.
- and a number of other models not listed here are also supported
## Installation
### If your SWS12500 station's firmware is 1.0 or your station is configured as described in this README and you still can not see any data incoming to Home Assistant please [read here](https://github.com/schizza/SWS-12500-custom-component/issues/17) and [here](firmware_bug.md)
### For stations that send through WSLink API
### For stations that send data through WSLink API
Make sure you have your Home Assistant configured in SSL mode or use [WSLink SSL proxy addon](https://github.com/schizza/wslink-addon) to bypass SSL configuration of the whole Home Assistant.
@@ -83,6 +98,13 @@ If you change `API ID` or `API KEY` in the station, you have to reconfigure inte
As soon as the integration is added into Home Assistant it will listen for incoming data from the station and starts to fill sensors as soon as data will first arrive.
## Upgrading from PWS to WSLink
If you upgrade your station, which was previously sending data in the PWS protocol, to a station with the WSLink protocol, you have to remove the integration and reinstall it. The WSLink protocol uses the metric scale instead of the imperial scale used in the PWS protocol.
So, deleting the integration and reinstalling it will make sure that the sensors are aware of the change of measurement scale.
- as the sensors' unique IDs are the same, you will not lose any historical data
## Resending data to Windy API
- First of all you need to create account at [Windy stations](https://stations.windy.com).
@@ -99,6 +121,16 @@ As soon as the integration is added into Home Assistant it will listen for incom
- You are done.
## Resending data to Pocasi Meteo
- If you are willing to use [Pocasi Meteo Application](https://pocasimeteo.cz) you can enable resending your data to their servers
- You must have an account at Pocasi Meteo, where you will receive an `ID` and `KEY`, which are needed to connect to the server
- In `Settings` -> `Devices & services` find SWS12500 and click `Configure`.
- In dialog box choose `Pocasi Meteo configuration`.
- Fill in `ID` and `KEY` you were provided at `Pocasi Meteo`.
- Tick `Enable` checkbox.
- You are done.
## WSLink notes
While your station is using WSLink, you have to have Home Assistant in SSL mode or behind an SSL proxy server.
@@ -117,4 +149,4 @@ you will set URL in station to: 192.0.0.2:4443
- Your station will be sending data to this SSL proxy and addon will handle the rest.
_Most of the stations do not care about self-signed certificates on the server side._

custom_components/sws12500/__init__.py
@@ -1,28 +1,60 @@
"""The Sencor SWS 12500 Weather Station integration."""
"""Sencor SWS 12500 Weather Station integration (push/webhook based).
Architecture overview
---------------------
This integration is *push-based*: the weather station calls our HTTP endpoint and we
receive a query payload. We do not poll the station.
Key building blocks:
- `WeatherDataUpdateCoordinator` acts as an in-memory "data bus" for the latest payload.
On each webhook request we call `async_set_updated_data(...)` and all `CoordinatorEntity`
sensors get notified and update their states.
- `hass.data[DOMAIN][entry_id]` is a per-entry *dict* that stores runtime state
(coordinator instance, options snapshot, and sensor platform callbacks). Keeping this
structure consistent is critical; mixing different value types under the same key can
break listener wiring and make the UI appear "frozen".
Auto-discovery
--------------
When the station starts sending a new field, we:
1) persist the new sensor key into options (`SENSORS_TO_LOAD`)
2) dynamically add the new entity through the sensor platform (without reloading)
Why avoid reload?
Reloading a config entry unloads platforms temporarily, which removes coordinator listeners.
With a high-frequency push source (webhook), a reload at the wrong moment can lead to a
period where no entities are subscribed, causing stale states until another full reload/restart.
"""
import logging
from typing import Any
import aiohttp
import aiohttp.web
from aiohttp.web_exceptions import HTTPUnauthorized
from py_typecheck import checked, checked_or
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import InvalidStateError, PlatformNotReady
from homeassistant.exceptions import ConfigEntryNotReady, InvalidStateError, PlatformNotReady
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import (
API_ID,
API_KEY,
DEFAULT_URL,
DEV_DBG,
DOMAIN,
HEALTH_URL,
POCASI_CZ_ENABLED,
SENSORS_TO_LOAD,
WINDY_ENABLED,
WSLINK,
WSLINK_URL,
)
from .routes import Routes, unregistred
from .data import ENTRY_COORDINATOR, ENTRY_HEALTH_COORD, ENTRY_LAST_OPTIONS
from .health_coordinator import HealthCoordinator
from .pocasti_cz import PocasiPush
from .routes import Routes
from .utils import (
anonymize,
check_disabled,
@@ -43,66 +75,173 @@ class IncorrectDataError(InvalidStateError):
"""Invalid exception."""
# NOTE:
# We intentionally avoid importing the sensor platform module at import-time here.
# Home Assistant can import modules in different orders; keeping imports acyclic
# prevents "partially initialized module" failures (circular imports / partially initialized modules).
#
# When we need to dynamically add sensors, we do a local import inside the webhook handler.
class WeatherDataUpdateCoordinator(DataUpdateCoordinator):
"""Manage fetched data."""
"""Coordinator for push updates.
Even though Home Assistant's `DataUpdateCoordinator` is often used for polling,
it also works well as a "fan-out" mechanism for push integrations:
- webhook handler updates `self.data` via `async_set_updated_data`
- all `CoordinatorEntity` instances subscribed to this coordinator update themselves
"""
def __init__(self, hass: HomeAssistant, config: ConfigEntry) -> None:
"""Init global updater."""
self.hass = hass
self.config = config
self.windy = WindyPush(hass, config)
"""Initialize the coordinator.
`config` is the config entry for this integration instance. We store it because
the webhook handler needs access to options (auth data, enabled features, etc.).
"""
self.hass: HomeAssistant = hass
self.config: ConfigEntry = config
self.windy: WindyPush = WindyPush(hass, config)
self.pocasi: PocasiPush = PocasiPush(hass, config)
super().__init__(hass, _LOGGER, name=DOMAIN)
async def recieved_data(self, webdata):
"""Handle incoming data query."""
_wslink = self.config_entry.options.get(WSLINK)
data = webdata.query
def _health_coordinator(self) -> HealthCoordinator | None:
"""Return the health coordinator for this config entry."""
if (data := checked(self.hass.data.get(DOMAIN), dict[str, Any])) is None:
return None
if (entry := checked(data.get(self.config.entry_id), dict[str, Any])) is None:
return None
response = None
coordinator = entry.get(ENTRY_HEALTH_COORD)
return coordinator if isinstance(coordinator, HealthCoordinator) else None
async def received_data(self, webdata: aiohttp.web.Request) -> aiohttp.web.Response:
"""Handle incoming webhook payload from the station.
This method:
- validates authentication (different keys for WU vs WSLink)
- optionally forwards data to third-party services (Windy / Pocasi)
- remaps payload keys to internal sensor keys
- auto-discovers new sensor fields and adds entities dynamically
- updates coordinator data so existing entities refresh immediately
"""
# WSLink uses different auth and payload field naming than the legacy endpoint.
_wslink: bool = checked_or(self.config.options.get(WSLINK), bool, False)
# Incoming station payload is delivered as query params.
# Some stations post data in the body, so we need to merge that data in as well.
#
# We copy it to a plain dict so it can be passed around safely.
get_data = webdata.query
post_data = await webdata.post()
# normalize incoming data to dict[str, Any]
data: dict[str, Any] = {**dict(get_data), **dict(post_data)}
# Get health data coordinator
health = self._health_coordinator()
# Validate auth keys (different parameter names depending on endpoint mode).
if not _wslink and ("ID" not in data or "PASSWORD" not in data):
_LOGGER.error("Invalid request. No security data provided!")
if health:
health.update_ingress_result(
webdata,
accepted=False,
authorized=False,
reason="missing_credentials",
)
raise HTTPUnauthorized
if _wslink and ("wsid" not in data or "wspw" not in data):
_LOGGER.error("Invalid request. No security data provided!")
if health:
health.update_ingress_result(
webdata,
accepted=False,
authorized=False,
reason="missing_credentials",
)
raise HTTPUnauthorized
id_data: str = ""
key_data: str = ""
if _wslink:
id_data = data.get("wsid", "")
key_data = data.get("wspw", "")
else:
id_data = data.get("ID", "")
key_data = data.get("PASSWORD", "")
# Validate credentials against the integration's configured options.
# If auth doesn't match, we reject the request (prevents random pushes from the LAN/Internet).
if (_id := checked(self.config.options.get(API_ID), str)) is None:
_LOGGER.error("We don't have API ID set! Update your config!")
if health:
health.update_ingress_result(
webdata,
accepted=False,
authorized=None,
reason="config_missing_api_id",
)
raise IncorrectDataError
if (_key := checked(self.config.options.get(API_KEY), str)) is None:
_LOGGER.error("We don't have API KEY set! Update your config!")
if health:
health.update_ingress_result(
webdata,
accepted=False,
authorized=None,
reason="config_missing_api_key",
)
raise IncorrectDataError
if id_data != _id or key_data != _key:
_LOGGER.error("Unauthorised access!")
if health:
health.update_ingress_result(
webdata,
accepted=False,
authorized=False,
reason="unauthorized",
)
raise HTTPUnauthorized
# Convert raw payload keys to our internal sensor keys (stable identifiers).
remaped_items: dict[str, str] = remap_wslink_items(data) if _wslink else remap_items(data)
# Auto-discovery: if payload contains keys that are not enabled/loaded yet,
# add them to the option list and create entities dynamically.
if sensors := check_disabled(remaped_items, self.config):
if (
translate_sensors := checked(
[
await translations(
self.hass,
DOMAIN,
f"sensor.{t_key}",
key="name",
category="entity",
)
for t_key in sensors
if await translations(
self.hass,
DOMAIN,
f"sensor.{t_key}",
key="name",
category="entity",
)
is not None
],
list[str],
)
) is not None:
human_readable: str = "\n".join(translate_sensors)
else:
human_readable = ""
await translated_notification(
self.hass,
@ -110,117 +249,196 @@ class WeatherDataUpdateCoordinator(DataUpdateCoordinator):
"added",
{"added_sensors": f"{human_readable}\n"},
)
# Persist newly discovered sensor keys to options (so they remain enabled after restart).
newly_discovered = list(sensors)
if _loaded_sensors := loaded_sensors(self.config):
sensors.extend(_loaded_sensors)
await update_options(self.hass, self.config, SENSORS_TO_LOAD, sensors)
# Dynamically add newly discovered sensors *without* reloading the entry.
#
# Why: Reloading a config entry unloads platforms temporarily. That removes coordinator
# listeners; with frequent webhook pushes the UI can appear "frozen" until the listeners
# are re-established. Dynamic adds avoid this window completely.
#
# We do a local import to avoid circular imports at module import time.
#
# NOTE: Some linters prefer top-level imports. In this case the local import is
# intentional and prevents "partially initialized module" errors.
from .sensor import add_new_sensors # noqa: PLC0415 (local import is intentional)
add_new_sensors(self.hass, self.config, newly_discovered)
# Fan-out update: notify all subscribed entities.
self.async_set_updated_data(remaped_items)
if health:
health.update_ingress_result(
webdata,
accepted=True,
authorized=True,
reason="accepted",
)
# Optional forwarding to external services. This is kept here (in the webhook handler)
# to avoid additional background polling tasks.
_windy_enabled = checked_or(self.config.options.get(WINDY_ENABLED), bool, False)
_pocasi_enabled = checked_or(self.config.options.get(POCASI_CZ_ENABLED), bool, False)
if _windy_enabled:
await self.windy.push_data_to_windy(data, _wslink)
if _pocasi_enabled:
await self.pocasi.push_data_to_server(data, "WSLINK" if _wslink else "WU")
if health:
health.update_forwarding(self.windy, self.pocasi)
# Optional dev logging (keep it lightweight to avoid log spam under high-frequency updates).
if self.config.options.get(DEV_DBG):
_LOGGER.info("Dev log: %s", anonymize(data))
return aiohttp.web.Response(body="OK", status=200)
def register_path(
hass: HomeAssistant,
url_path: str,
coordinator: WeatherDataUpdateCoordinator,
coordinator_h: HealthCoordinator,
config: ConfigEntry,
) -> bool:
"""Register webhook paths.
We register both possible endpoints and use an internal dispatcher (`Routes`) to
enable exactly one of them. This lets us toggle WSLink mode without re-registering
routes on the aiohttp router.
"""
hass.data.setdefault(DOMAIN, {})
if (hass_data := checked(hass.data[DOMAIN], dict[str, Any])) is None:
raise ConfigEntryNotReady
_wslink: bool = checked_or(config.options.get(WSLINK), bool, False)
# Load registered routes
routes: Routes | None = hass_data.get("routes", None)
if not isinstance(routes, Routes):
routes = Routes()
_LOGGER.info("Routes not found, creating new routes")
# Register webhooks in HomeAssistant with dispatcher
try:
_default_route = hass.http.app.router.add_get(DEFAULT_URL, routes.dispatch, name="_default_route")
_wslink_post_route = hass.http.app.router.add_post(WSLINK_URL, routes.dispatch, name="_wslink_post_route")
_wslink_get_route = hass.http.app.router.add_get(WSLINK_URL, routes.dispatch, name="_wslink_get_route")
_health_route = hass.http.app.router.add_get(HEALTH_URL, routes.dispatch, name="_health_route")
# Save initialised routes
hass_data["routes"] = routes
except RuntimeError as Ex:
_LOGGER.critical("Routes cannot be added. Integration will not work as expected. %s", Ex)
raise ConfigEntryNotReady from Ex
# Finally create internal route dispatcher with provided urls, while we have webhooks registered.
routes.add_route(DEFAULT_URL, _default_route, coordinator.received_data, enabled=not _wslink)
routes.add_route(WSLINK_URL, _wslink_post_route, coordinator.received_data, enabled=_wslink)
routes.add_route(WSLINK_URL, _wslink_get_route, coordinator.received_data, enabled=_wslink)
# Make health route `sticky` so it will not change upon updating options.
routes.add_route(
HEALTH_URL,
_health_route,
coordinator_h.health_status,
enabled=True,
sticky=True,
)
routes.set_ingress_observer(coordinator_h.record_dispatch)
_LOGGER.info("Registered routes: %s", routes.show_enabled())
return True
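The dispatcher pattern behind `register_path` can be illustrated with a toy model. The real `Routes` class is not shown in this diff, so everything below is an assumed, simplified shape: routes stay registered on the aiohttp router, and an internal table keyed by `(method, path)` decides which handler is active; `sticky` routes (health) survive switches.

```python
class MiniRoutes:
    """Illustrative dispatcher: aiohttp routes stay registered once; we
    only flip which handler is active per (method, path) key."""

    def __init__(self):
        self._handlers = {}  # (method, path) -> handler
        self._enabled = {}   # (method, path) -> bool
        self._sticky = {}    # (method, path) -> bool

    def add_route(self, method, path, handler, enabled=False, sticky=False):
        key = (method, path)
        self._handlers[key] = handler
        self._enabled[key] = enabled
        self._sticky[key] = sticky

    def switch_route(self, method, path):
        # Enable exactly one non-sticky endpoint; sticky ones are untouched.
        for key in self._handlers:
            if not self._sticky[key]:
                self._enabled[key] = key == (method, path)

    def dispatch(self, method, path, request):
        key = (method, path)
        if not self._enabled.get(key):
            return 404  # stand-in for an HTTPNotFound response
        return self._handlers[key](request)
```

This shows why WSLink mode can be toggled without re-registering aiohttp routes: only the internal `_enabled` flags change.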
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up a config entry.
Important:
- We store per-entry runtime state under `hass.data[DOMAIN][entry_id]` as a dict.
- We reuse the same coordinator instance across reloads so that:
- the webhook handler keeps updating the same coordinator
- already-created entities remain subscribed
"""
hass_data = hass.data.setdefault(DOMAIN, {})
# Per-entry runtime storage:
# hass.data[DOMAIN][entry_id] is always a dict (never the coordinator itself).
# Mixing types here (sometimes dict, sometimes coordinator) is a common source of hard-to-debug
# issues where entities stop receiving updates.
if (entry_data := checked(hass_data.get(entry.entry_id), dict[str, Any])) is None:
entry_data = {}
hass_data[entry.entry_id] = entry_data
# Reuse the existing coordinator across reloads so webhook handlers and entities
# remain connected to the same coordinator instance.
#
# Note: Routes store a bound method (`coordinator.received_data`). If we replaced the coordinator
# instance on reload, the dispatcher could keep calling the old instance while entities listen
# to the new one, causing updates to "disappear".
coordinator = entry_data.get(ENTRY_COORDINATOR)
if isinstance(coordinator, WeatherDataUpdateCoordinator):
coordinator.config = entry
# Recreate helper instances so they pick up updated options safely.
coordinator.windy = WindyPush(hass, entry)
coordinator.pocasi = PocasiPush(hass, entry)
else:
coordinator = WeatherDataUpdateCoordinator(hass, entry)
entry_data[ENTRY_COORDINATOR] = coordinator
# Similar to the coordinator, we want to reuse the same health coordinator instance across
# reloads so that the health endpoint remains responsive and doesn't lose its listeners.
coordinator_health = entry_data.get(ENTRY_HEALTH_COORD)
if isinstance(coordinator_health, HealthCoordinator):
coordinator_health.config = entry
else:
coordinator_health = HealthCoordinator(hass, entry)
entry_data[ENTRY_HEALTH_COORD] = coordinator_health
routes: Routes | None = hass_data.get("routes", None)
# Keep an options snapshot so update_listener can skip reloads when only `SENSORS_TO_LOAD` changes.
# Auto-discovery updates this option frequently and we do not want to reload for that case.
entry_data[ENTRY_LAST_OPTIONS] = dict(entry.options)
_wslink = checked_or(entry.options.get(WSLINK), bool, False)
_LOGGER.debug("WS Link is %s", "enabled" if _wslink else "disabled")
if routes:
_LOGGER.debug("We have routes registered, will try to switch dispatcher.")
routes.switch_route(coordinator.received_data, DEFAULT_URL if not _wslink else WSLINK_URL)
routes.set_ingress_observer(coordinator_health.record_dispatch)
coordinator_health.update_routing(routes)
_LOGGER.debug("%s", routes.show_enabled())
else:
routes_enabled = register_path(hass, coordinator, coordinator_health, entry)
if not routes_enabled:
_LOGGER.error("Fatal: path not registered!")
raise PlatformNotReady
routes = hass_data.get("routes", None)
if isinstance(routes, Routes):
coordinator_health.update_routing(routes)
await coordinator_health.async_config_entry_first_refresh()
coordinator_health.update_forwarding(coordinator.windy, coordinator.pocasi)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
@ -230,10 +448,38 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def update_listener(hass: HomeAssistant, entry: ConfigEntry):
"""Handle config entry option updates.
We skip reloading when only `SENSORS_TO_LOAD` changes.
Why:
- Auto-discovery updates `SENSORS_TO_LOAD` as new payload fields appear.
- Reloading a push-based integration temporarily unloads platforms and removes
coordinator listeners, which can make the UI appear "stuck" until restart.
"""
if (hass_data := checked(hass.data.get(DOMAIN), dict[str, Any])) is not None:
if (entry_data := checked(hass_data.get(entry.entry_id), dict[str, Any])) is not None:
if (old_options := checked(entry_data.get(ENTRY_LAST_OPTIONS), dict[str, Any])) is not None:
new_options = dict(entry.options)
changed_keys = {
k
for k in set(old_options.keys()) | set(new_options.keys())
if old_options.get(k) != new_options.get(k)
}
# Update snapshot early for the next comparison.
entry_data[ENTRY_LAST_OPTIONS] = new_options
if changed_keys == {SENSORS_TO_LOAD}:
_LOGGER.debug("Options updated (%s); skipping reload.", SENSORS_TO_LOAD)
return
else:
# No/invalid snapshot: store current options for next comparison.
entry_data[ENTRY_LAST_OPTIONS] = dict(entry.options)
_ = await hass.config_entries.async_reload(entry.entry_id)
_LOGGER.info("Settings updated")
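The skip-reload decision in `update_listener` reduces to one pure function. A minimal sketch, with the constant value assumed to match the integration's `SENSORS_TO_LOAD`:

```python
SENSORS_TO_LOAD = "sensors_to_load"  # assumed constant value

def should_reload(old_options: dict, new_options: dict) -> bool:
    # Collect every key whose value differs between the snapshots,
    # then reload unless SENSORS_TO_LOAD is the *only* changed key.
    changed_keys = {
        k
        for k in set(old_options) | set(new_options)
        if old_options.get(k) != new_options.get(k)
    }
    return changed_keys != {SENSORS_TO_LOAD}
```

Note that an empty diff also returns `True`, mirroring the original flow, which falls through to the reload whenever the changed-key set is anything other than exactly `{SENSORS_TO_LOAD}`.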


@ -0,0 +1,53 @@
"""Battery binary sensor entities."""
from __future__ import annotations
from typing import Any
from py_typecheck import checked_or
from homeassistant.components.binary_sensor import BinarySensorEntity, BinarySensorEntityDescription
from homeassistant.helpers.update_coordinator import CoordinatorEntity
class BatteryBinarySensor( # pyright: ignore[reportIncompatibleVariableOverride]
CoordinatorEntity, BinarySensorEntity
):
"""Represent a low-battery binary sensor.
Station payload uses:
- ``0`` => low battery (binary sensor is ``on``)
- ``1`` => battery OK (binary sensor is ``off``)
"""
_attr_has_entity_name = True
_attr_should_poll = False
def __init__(
self,
coordinator: Any,
description: BinarySensorEntityDescription,
) -> None:
"""Initialize the battery binary sensor."""
super().__init__(coordinator)
self.entity_description = description
self._attr_unique_id = f"{description.key}_battery"
@property
def is_on(self) -> bool | None: # pyright: ignore[reportIncompatibleVariableOverride]
"""Return low-battery state.
``True`` means low battery for ``BinarySensorDeviceClass.BATTERY``.
"""
data = checked_or(self.coordinator.data, dict[str, Any], {})
raw: Any = data.get(self.entity_description.key)
if raw is None or raw == "":
return None
try:
value = int(raw)
except (TypeError, ValueError):
return None
return value == 0


@ -0,0 +1,24 @@
"""Battery sensors."""
from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
BinarySensorEntityDescription,
)
BATTERY_BINARY_SENSORS: tuple[BinarySensorEntityDescription, ...] = (
BinarySensorEntityDescription(
key="outside_battery",
translation_key="outside_battery",
device_class=BinarySensorDeviceClass.BATTERY,
),
BinarySensorEntityDescription(
key="indoor_battery",
translation_key="indoor_battery",
device_class=BinarySensorDeviceClass.BATTERY,
),
BinarySensorEntityDescription(
key="ch2_battery",
translation_key="ch2_battery",
device_class=BinarySensorDeviceClass.BATTERY,
),
)


@ -1,34 +1,38 @@
"""Config flow for Sencor SWS 12500 Weather Station integration."""
import logging
import secrets
from typing import Any
import voluptuous as vol
from yarl import URL
from homeassistant.config_entries import ConfigFlow, OptionsFlow
from homeassistant.config_entries import ConfigEntry, ConfigFlow, ConfigFlowResult, OptionsFlow
from homeassistant.core import callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.network import get_url
from .const import (
API_ID,
API_KEY,
DEV_DBG,
DOMAIN,
ECOWITT_ENABLED,
ECOWITT_WEBHOOK_ID,
INVALID_CREDENTIALS,
POCASI_CZ_API_ID,
POCASI_CZ_API_KEY,
POCASI_CZ_ENABLED,
POCASI_CZ_LOGGER_ENABLED,
POCASI_CZ_SEND_INTERVAL,
POCASI_CZ_SEND_MINIMUM,
SENSORS_TO_LOAD,
WINDY_API_KEY,
WINDY_ENABLED,
WINDY_LOGGER_ENABLED,
WINDY_STATION_ID,
WINDY_STATION_PW,
WSLINK,
WSLINK_ADDON_PORT,
)
_LOGGER = logging.getLogger(__name__)
class CannotConnect(HomeAssistantError):
@ -52,24 +56,19 @@ class ConfigOptionsFlowHandler(OptionsFlow):
self.user_data_schema = {}
self.sensors: dict[str, Any] = {}
self.pocasi_cz: dict[str, Any] = {}
self.pocasi_cz_schema = {}
self.ecowitt: dict[str, Any] = {}
self.ecowitt_schema = {}
self.wslink_addon_port: dict[str, int] = {}
self.wslink_addon_schema = {}
async def _get_entry_data(self):
"""Get entry data."""
self.user_data = {
API_ID: self.config_entry.options.get(API_ID, ""),
API_KEY: self.config_entry.options.get(API_KEY, ""),
WSLINK: self.config_entry.options.get(WSLINK, False),
DEV_DBG: self.config_entry.options.get(DEV_DBG, False),
}
@ -81,70 +80,73 @@ class ConfigOptionsFlowHandler(OptionsFlow):
vol.Optional(DEV_DBG, default=self.user_data.get(DEV_DBG, False)): bool,
}
self.sensors = {
SENSORS_TO_LOAD: (
self.config_entry.options.get(SENSORS_TO_LOAD)
if isinstance(self.config_entry.options.get(SENSORS_TO_LOAD), list)
else []
)
}
self.windy_data = {
WINDY_STATION_ID: self.config_entry.options.get(WINDY_STATION_ID, ""),
WINDY_STATION_PW: self.config_entry.options.get(WINDY_STATION_PW, ""),
WINDY_LOGGER_ENABLED: self.config_entry.options.get(WINDY_LOGGER_ENABLED, False),
WINDY_ENABLED: self.config_entry.options.get(WINDY_ENABLED, False),
}
self.windy_data_schema = {
vol.Optional(WINDY_STATION_ID, default=self.windy_data.get(WINDY_STATION_ID, "")): str,
vol.Optional(
WINDY_STATION_PW,
default=self.windy_data.get(WINDY_STATION_PW, ""),
): str,
vol.Optional(WINDY_ENABLED, default=self.windy_data[WINDY_ENABLED]): bool,
vol.Optional(
WINDY_LOGGER_ENABLED,
default=self.windy_data[WINDY_LOGGER_ENABLED],
): bool,
}
self.pocasi_cz = {
POCASI_CZ_API_ID: self.config_entry.options.get(POCASI_CZ_API_ID, ""),
POCASI_CZ_API_KEY: self.config_entry.options.get(POCASI_CZ_API_KEY, ""),
POCASI_CZ_ENABLED: self.config_entry.options.get(POCASI_CZ_ENABLED, False),
POCASI_CZ_LOGGER_ENABLED: self.config_entry.options.get(POCASI_CZ_LOGGER_ENABLED, False),
POCASI_CZ_SEND_INTERVAL: self.config_entry.options.get(POCASI_CZ_SEND_INTERVAL, 30),
}
self.pocasi_cz_schema = {
vol.Required(POCASI_CZ_API_ID, default=self.pocasi_cz.get(POCASI_CZ_API_ID)): str,
vol.Required(POCASI_CZ_API_KEY, default=self.pocasi_cz.get(POCASI_CZ_API_KEY)): str,
vol.Required(
POCASI_CZ_SEND_INTERVAL,
default=self.pocasi_cz.get(POCASI_CZ_SEND_INTERVAL),
): int,
vol.Optional(POCASI_CZ_ENABLED, default=self.pocasi_cz.get(POCASI_CZ_ENABLED)): bool,
vol.Optional(
POCASI_CZ_LOGGER_ENABLED,
default=self.pocasi_cz.get(POCASI_CZ_LOGGER_ENABLED),
): bool,
}
self.ecowitt = {
ECOWITT_WEBHOOK_ID: self.config_entry.options.get(ECOWITT_WEBHOOK_ID, ""),
ECOWITT_ENABLED: self.config_entry.options.get(ECOWITT_ENABLED, False),
}
self.wslink_addon_port = {WSLINK_ADDON_PORT: self.config_entry.options.get(WSLINK_ADDON_PORT, 443)}
async def async_step_init(self, user_input: dict[str, Any] | None = None):
"""Manage the options - show menu first."""
_ = user_input
return self.async_show_menu(
step_id="init", menu_options=["basic", "wslink_port_setup", "ecowitt", "windy", "pocasi"]
)
async def async_step_basic(self, user_input: Any = None):
"""Manage basic options - credentials."""
errors: dict[str, str] = {}
await self._get_entry_data()
@ -162,11 +164,7 @@ class ConfigOptionsFlowHandler(OptionsFlow):
elif user_input[API_KEY] == user_input[API_ID]:
errors["base"] = "valid_credentials_match"
else:
user_input = self.retain_data(user_input)
return self.async_create_entry(title=DOMAIN, data=user_input)
@ -179,9 +177,9 @@ class ConfigOptionsFlowHandler(OptionsFlow):
errors=errors,
)
async def async_step_windy(self, user_input: Any = None):
"""Manage windy options."""
errors: dict[str, str] = {}
await self._get_entry_data()
@ -192,169 +190,131 @@ class ConfigOptionsFlowHandler(OptionsFlow):
errors=errors,
)
if (user_input[WINDY_ENABLED] is True) and (
(user_input[WINDY_STATION_ID] == "") or (user_input[WINDY_STATION_PW] == "")
):
errors[WINDY_STATION_ID] = "windy_key_required"
return self.async_show_form(
step_id="windy",
data_schema=vol.Schema(self.windy_data_schema),
errors=errors,
)
user_input = self.retain_data(user_input)
return self.async_create_entry(title=DOMAIN, data=user_input)
async def async_step_pocasi(self, user_input: Any = None) -> ConfigFlowResult:
"""Handle the pocasi step."""
errors: dict[str, str] = {}
await self._get_entry_data()
if user_input is None:
return self.async_show_form(
step_id="pocasi",
data_schema=vol.Schema(self.pocasi_cz_schema),
errors=errors,
)
if user_input.get(POCASI_CZ_SEND_INTERVAL, 0) < POCASI_CZ_SEND_MINIMUM:
errors[POCASI_CZ_SEND_INTERVAL] = "pocasi_send_minimum"
if user_input.get(POCASI_CZ_ENABLED):
if user_input.get(POCASI_CZ_API_ID) == "":
errors[POCASI_CZ_API_ID] = "pocasi_id_required"
if user_input.get(POCASI_CZ_API_KEY) == "":
errors[POCASI_CZ_API_KEY] = "pocasi_key_required"
if len(errors) > 0:
return self.async_show_form(
step_id="pocasi",
data_schema=vol.Schema(self.pocasi_cz_schema),
errors=errors,
)
user_input = self.retain_data(user_input)
return self.async_create_entry(title=DOMAIN, data=user_input)
async def async_step_ecowitt(self, user_input: Any = None) -> ConfigFlowResult:
"""Ecowitt stations setup."""
errors: dict[str, str] = {}
await self._get_entry_data()
if not (webhook := self.ecowitt.get(ECOWITT_WEBHOOK_ID)):
webhook = secrets.token_hex(8)
if user_input is None:
url: URL = URL(get_url(self.hass))
if not url.host:
url = url.with_host("UNKNOWN")
ecowitt_schema = {
vol.Required(
ECOWITT_WEBHOOK_ID,
default=webhook,
): str,
vol.Optional(
ECOWITT_ENABLED,
default=self.ecowitt.get(ECOWITT_ENABLED, False),
): bool,
}
return self.async_show_form(
step_id="ecowitt",
data_schema=vol.Schema(ecowitt_schema),
description_placeholders={
"url": url.host,
"port": str(url.port),
"webhook_id": webhook,
},
errors=errors,
)
user_input = self.retain_data(user_input)
return self.async_create_entry(title=DOMAIN, data=user_input)
async def async_step_wslink_port_setup(self, user_input: Any = None) -> ConfigFlowResult:
"""WSLink Addon port setup."""
errors: dict[str, str] = {}
await self._get_entry_data()
if not (port := self.wslink_addon_port.get(WSLINK_ADDON_PORT)):
port = 443
wslink_port_schema = {
vol.Required(WSLINK_ADDON_PORT, default=port): int,
}
if user_input is None:
return self.async_show_form(
step_id="wslink_port_setup",
data_schema=vol.Schema(wslink_port_schema),
errors=errors,
)
user_input = self.retain_data(user_input)
return self.async_create_entry(title=DOMAIN, data=user_input)
def retain_data(self, data: dict[str, Any]) -> dict[str, Any]:
"""Merge retained option groups with newly submitted data (submitted values win)."""
return {
**self.user_data,
**self.windy_data,
**self.pocasi_cz,
**self.sensors,
**self.ecowitt,
**self.wslink_addon_port,
**dict(data),
}
class ConfigFlowHandler(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Sencor SWS 12500 Weather Station."""
data_schema = {
@ -366,7 +326,7 @@ class ConfigFlow(ConfigFlow, domain=DOMAIN):
VERSION = 1
async def async_step_user(self, user_input: Any = None):
"""Handle the initial step."""
if user_input is None:
await self.async_set_unique_id(DOMAIN)
@ -377,7 +337,7 @@ class ConfigFlow(ConfigFlow, domain=DOMAIN):
data_schema=vol.Schema(self.data_schema),
)
errors: dict[str, str] = {}
if user_input[API_ID] in INVALID_CREDENTIALS:
errors[API_ID] = "valid_credentials_api"
@ -386,9 +346,7 @@ class ConfigFlow(ConfigFlow, domain=DOMAIN):
elif user_input[API_KEY] == user_input[API_ID]:
errors["base"] = "valid_credentials_match"
else:
return self.async_create_entry(title=DOMAIN, data=user_input, options=user_input)
return self.async_show_form(
step_id="user",
@ -398,6 +356,6 @@ class ConfigFlow(ConfigFlow, domain=DOMAIN):
@staticmethod
@callback
def async_get_options_flow(config_entry: ConfigEntry) -> ConfigOptionsFlowHandler:
"""Get the options flow for this handler."""
return ConfigOptionsFlowHandler()
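The `retain_data` merge order matters: each `**group` spread is overridden by later spreads, so the submitted form values always win. A standalone sketch of that precedence (illustrative names, not the integration's API):

```python
def retain(submitted: dict, *retained_groups: dict) -> dict:
    # Later updates win: start from the retained option groups, then let
    # the newly submitted form values override on key collisions.
    merged: dict = {}
    for group in retained_groups:
        merged.update(group)
    merged.update(submitted)
    return merged
```

This is why each options step can safely return `async_create_entry` with only its own form fields: every other group's settings are carried over untouched.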


@ -3,35 +3,20 @@
from enum import StrEnum
from typing import Final
# Integration specific constants.
DOMAIN = "sws12500"
DEFAULT_URL = "/weatherstation/updateweatherstation.php"
WSLINK_URL = "/data/upload.php"
WINDY_URL = "https://stations.windy.com/pws/update/"
DATABASE_PATH = "/config/home-assistant_v2.db"
ICON = "mdi:weather"
# Common constants
API_KEY = "API_KEY"
API_ID = "API_ID"
SENSORS_TO_LOAD: Final = "sensors_to_load"
DEV_DBG: Final = "dev_debug_checkbox"
WSLINK: Final = "wslink"
WINDY_API_KEY = "WINDY_API_KEY"
WINDY_ENABLED: Final = "windy_enabled_checkbox"
WINDY_LOGGER_ENABLED: Final = "windy_logger_checkbox"
WINDY_NOT_INSERTED: Final = "Data was successfully sent to Windy but was not inserted by the Windy API. Is anyone else sending data to this Windy station?"
WINDY_INVALID_KEY: Final = "Windy API KEY is invalid. Sending data to Windy is now disabled. Check your API KEY and try again."
WINDY_SUCCESS: Final = (
"Windy successfully sent data and data was successfully inserted by Windy API"
)
WINDY_UNEXPECTED: Final = (
"Windy responded unexpectedly 3 times in a row. Send to Windy is now disabled!"
)
INVALID_CREDENTIALS: Final = [
"API",
"API_ID",
@ -44,6 +29,69 @@ INVALID_CREDENTIALS: Final = [
"_KEY",
]
# Sensor constants
BARO_PRESSURE: Final = "baro_pressure"
OUTSIDE_TEMP: Final = "outside_temp"
DEW_POINT: Final = "dew_point"
OUTSIDE_HUMIDITY: Final = "outside_humidity"
OUTSIDE_CONNECTION: Final = "outside_connection"
OUTSIDE_BATTERY: Final = "outside_battery"
WIND_SPEED: Final = "wind_speed"
WIND_GUST: Final = "wind_gust"
WIND_DIR: Final = "wind_dir"
WIND_AZIMUT: Final = "wind_azimut"
RAIN: Final = "rain"
HOURLY_RAIN: Final = "hourly_rain"
WEEKLY_RAIN: Final = "weekly_rain"
MONTHLY_RAIN: Final = "monthly_rain"
YEARLY_RAIN: Final = "yearly_rain"
DAILY_RAIN: Final = "daily_rain"
SOLAR_RADIATION: Final = "solar_radiation"
INDOOR_TEMP: Final = "indoor_temp"
INDOOR_HUMIDITY: Final = "indoor_humidity"
INDOOR_BATTERY: Final = "indoor_battery"
UV: Final = "uv"
CH2_TEMP: Final = "ch2_temp"
CH2_HUMIDITY: Final = "ch2_humidity"
CH2_CONNECTION: Final = "ch2_connection"
CH2_BATTERY: Final = "ch2_battery"
CH3_TEMP: Final = "ch3_temp"
CH3_HUMIDITY: Final = "ch3_humidity"
CH3_CONNECTION: Final = "ch3_connection"
CH3_BATTERY: Final = "ch3_battery"
CH4_TEMP: Final = "ch4_temp"
CH4_HUMIDITY: Final = "ch4_humidity"
CH4_CONNECTION: Final = "ch4_connection"
CH4_BATTERY: Final = "ch4_battery"
CH5_TEMP: Final = "ch5_temp"
CH5_HUMIDITY: Final = "ch5_humidity"
CH5_CONNECTION: Final = "ch5_connection"
CH5_BATTERY: Final = "ch5_battery"
CH6_TEMP: Final = "ch6_temp"
CH6_HUMIDITY: Final = "ch6_humidity"
CH6_CONNECTION: Final = "ch6_connection"
CH6_BATTERY: Final = "ch6_battery"
CH7_TEMP: Final = "ch7_temp"
CH7_HUMIDITY: Final = "ch7_humidity"
CH7_CONNECTION: Final = "ch7_connection"
CH7_BATTERY: Final = "ch7_battery"
CH8_TEMP: Final = "ch8_temp"
CH8_HUMIDITY: Final = "ch8_humidity"
CH8_CONNECTION: Final = "ch8_connection"
CH8_BATTERY: Final = "ch8_battery"
HEAT_INDEX: Final = "heat_index"
CHILL_INDEX: Final = "chill_index"
WBGT_TEMP: Final = "wbgt_temp"
# Health specific constants
HEALTH_URL = "/station/health"
# PWS specific constants
DEFAULT_URL = "/weatherstation/updateweatherstation.php"
PURGE_DATA: Final = [
"ID",
"PASSWORD",
@ -57,40 +105,7 @@ PURGE_DATA: Final = [
"dailyrainin",
]
BARO_PRESSURE: Final = "baro_pressure"
OUTSIDE_TEMP: Final = "outside_temp"
DEW_POINT: Final = "dew_point"
OUTSIDE_HUMIDITY: Final = "outside_humidity"
OUTSIDE_CONNECTION: Final = "outside_connection"
WIND_SPEED: Final = "wind_speed"
WIND_GUST: Final = "wind_gust"
WIND_DIR: Final = "wind_dir"
WIND_AZIMUT: Final = "wind_azimut"
RAIN: Final = "rain"
HOURLY_RAIN: Final = "hourly_rain"
WEEKLY_RAIN: Final = "weekly_rain"
MONTHLY_RAIN: Final = "monthly_rain"
YEARLY_RAIN: Final = "yearly_rain"
DAILY_RAIN: Final = "daily_rain"
SOLAR_RADIATION: Final = "solar_radiation"
INDOOR_TEMP: Final = "indoor_temp"
INDOOR_HUMIDITY: Final = "indoor_humidity"
UV: Final = "uv"
CH2_TEMP: Final = "ch2_temp"
CH2_HUMIDITY: Final = "ch2_humidity"
CH2_CONNECTION: Final = "ch2_connection"
CH3_TEMP: Final = "ch3_temp"
CH3_HUMIDITY: Final = "ch3_humidity"
CH3_CONNECTION: Final = "ch3_connection"
CH4_TEMP: Final = "ch4_temp"
CH4_HUMIDITY: Final = "ch4_humidity"
CH4_CONNECTION: Final = "ch4_connection"
HEAT_INDEX: Final = "heat_index"
CHILL_INDEX: Final = "chill_index"
REMAP_ITEMS: dict = {
REMAP_ITEMS: dict[str, str] = {
"baromin": BARO_PRESSURE,
"tempf": OUTSIDE_TEMP,
"dewptf": DEW_POINT,
@ -110,9 +125,175 @@ REMAP_ITEMS: dict = {
"soilmoisture2": CH3_HUMIDITY,
"soiltemp3f": CH4_TEMP,
"soilmoisture3": CH4_HUMIDITY,
"soiltemp4f": CH5_TEMP,
"soilmoisture4": CH5_HUMIDITY,
"soiltemp5f": CH6_TEMP,
"soilmoisture5": CH6_HUMIDITY,
}
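The `REMAP_ITEMS` table above translates WU-style query keys into the integration's internal sensor keys. A minimal standalone sketch of how such a table is typically applied to an incoming payload (the two-entry table and payload values here are illustrative, not the full mapping):

```python
# Illustrative subset of the REMAP_ITEMS table above (not the full mapping).
REMAP_ITEMS = {
    "baromin": "baro_pressure",
    "tempf": "outside_temp",
}


def remap(payload: dict[str, str]) -> dict[str, str]:
    """Translate known WU query keys to internal sensor keys, dropping the rest."""
    return {REMAP_ITEMS[key]: value for key, value in payload.items() if key in REMAP_ITEMS}


result = remap({"baromin": "29.92", "tempf": "71.6", "PASSWORD": "hunter2"})
# result == {"baro_pressure": "29.92", "outside_temp": "71.6"}
```

Unknown keys (including credentials such as `PASSWORD`) simply fall through and are never stored as sensor values.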
REMAP_WSLINK_ITEMS: dict = {
WSLINK_URL = "/data/upload.php"
WINDY_URL = "https://stations.windy.com/api/v2/observation/update"
POCASI_CZ_URL: Final = "http://ms.pocasimeteo.cz"
POCASI_CZ_SEND_MINIMUM: Final = 12 # minimal time to resend data
WSLINK: Final = "wslink"
WINDY_MAX_RETRIES: Final = 3
WSLINK_ADDON_PORT: Final = "WSLINK_ADDON_PORT"
__all__ = [
"DOMAIN",
"DEFAULT_URL",
"WSLINK_URL",
"HEALTH_URL",
"WINDY_URL",
"DATABASE_PATH",
"POCASI_CZ_URL",
"POCASI_CZ_SEND_MINIMUM",
"ICON",
"API_KEY",
"API_ID",
"SENSORS_TO_LOAD",
"SENSOR_TO_MIGRATE",
"DEV_DBG",
"WSLINK",
"ECOWITT",
"ECOWITT_WEBHOOK_ID",
"ECOWITT_ENABLED",
"POCASI_CZ_API_KEY",
"POCASI_CZ_API_ID",
"POCASI_CZ_SEND_INTERVAL",
"POCASI_CZ_ENABLED",
"POCASI_CZ_LOGGER_ENABLED",
"POCASI_INVALID_KEY",
"POCASI_CZ_SUCCESS",
"POCASI_CZ_UNEXPECTED",
"WINDY_STATION_ID",
"WINDY_STATION_PW",
"WINDY_ENABLED",
"WINDY_LOGGER_ENABLED",
"WINDY_NOT_INSERTED",
"WINDY_INVALID_KEY",
"WINDY_SUCCESS",
"WINDY_UNEXPECTED",
"INVALID_CREDENTIALS",
"PURGE_DATA",
"PURGE_DATA_POCAS",
"BARO_PRESSURE",
"OUTSIDE_TEMP",
"DEW_POINT",
"OUTSIDE_HUMIDITY",
"OUTSIDE_CONNECTION",
"OUTSIDE_BATTERY",
"WIND_SPEED",
"WIND_GUST",
"WIND_DIR",
"WIND_AZIMUT",
"RAIN",
"HOURLY_RAIN",
"WEEKLY_RAIN",
"MONTHLY_RAIN",
"YEARLY_RAIN",
"DAILY_RAIN",
"SOLAR_RADIATION",
"INDOOR_TEMP",
"INDOOR_HUMIDITY",
"INDOOR_BATTERY",
"UV",
"CH2_TEMP",
"CH2_HUMIDITY",
"CH2_CONNECTION",
"CH2_BATTERY",
"CH3_TEMP",
"CH3_HUMIDITY",
"CH3_CONNECTION",
"CH4_TEMP",
"CH4_HUMIDITY",
"CH4_CONNECTION",
"HEAT_INDEX",
"CHILL_INDEX",
"WBGT_TEMP",
"REMAP_ITEMS",
"REMAP_WSLINK_ITEMS",
"DISABLED_BY_DEFAULT",
"BATTERY_LIST",
"UnitOfDir",
"AZIMUT",
"UnitOfBat",
"BATTERY_LEVEL",
]
ECOWITT: Final = "ecowitt"
ECOWITT_WEBHOOK_ID: Final = "ecowitt_webhook_id"
ECOWITT_ENABLED: Final = "ecowitt_enabled"
POCASI_CZ_API_KEY = "POCASI_CZ_API_KEY"
POCASI_CZ_API_ID = "POCASI_CZ_API_ID"
POCASI_CZ_SEND_INTERVAL = "POCASI_SEND_INTERVAL"
POCASI_CZ_ENABLED = "pocasi_enabled_chcekbox"
POCASI_CZ_LOGGER_ENABLED = "pocasi_logger_checkbox"
POCASI_INVALID_KEY: Final = "Pocasi Meteo refused to accept the data. Invalid ID/Key combination?"
POCASI_CZ_SUCCESS: Final = "Successfully sent data to Pocasi Meteo"
POCASI_CZ_UNEXPECTED: Final = "Pocasi Meteo responded unexpectedly 3 times in a row. Resending is now disabled!"
WINDY_STATION_ID = "WINDY_STATION_ID"
WINDY_STATION_PW = "WINDY_STATION_PWD"
WINDY_ENABLED: Final = "windy_enabled_checkbox"
WINDY_LOGGER_ENABLED: Final = "windy_logger_checkbox"
WINDY_NOT_INSERTED: Final = "Windy responded with a 400 error. Invalid ID/password combination?"
WINDY_INVALID_KEY: Final = (
    "Windy API KEY is invalid. Sending data to Windy is now disabled. Check your API KEY and try again."
)
WINDY_SUCCESS: Final = "Data was successfully sent to Windy and inserted by the Windy API"
WINDY_UNEXPECTED: Final = "Windy responded unexpectedly 3 times in a row. Send to Windy is now disabled!"
PURGE_DATA_POCAS: Final = [
"ID",
"PASSWORD",
"action",
"rtfreq",
]
"""NOTE: These are sensors that should be available with PWS protocol acording to https://support.weather.com/s/article/PWS-Upload-Protocol?language=en_US:
I have no option to test, if it will work correctly. So their implementatnion will be in future releases.
leafwetness - [%]
+ for sensor 2 use leafwetness2
visibility - [nm visibility]
pweather - [text] -- metar style (+RA)
clouds - [text] -- SKC, FEW, SCT, BKN, OVC
Pollution Fields:
AqNO - [ NO (nitric oxide) ppb ]
AqNO2T - (nitrogen dioxide), true measure ppb
AqNO2 - NO2 computed, NOx-NO ppb
AqNO2Y - NO2 computed, NOy-NO ppb
AqNOX - NOx (nitrogen oxides) - ppb
AqNOY - NOy (total reactive nitrogen) - ppb
AqNO3 - NO3 ion (nitrate, not adjusted for ammonium ion) UG/M3
AqSO4 - SO4 ion (sulfate, not adjusted for ammonium ion) UG/M3
AqSO2 - (sulfur dioxide), conventional ppb
AqSO2T - trace levels ppb
AqCO - CO (carbon monoxide), conventional ppm
AqCOT - CO trace levels ppb
AqEC - EC (elemental carbon) PM2.5 UG/M3
AqOC - OC (organic carbon, not adjusted for oxygen and hydrogen) PM2.5 UG/M3
AqBC - BC (black carbon at 880 nm) UG/M3
AqUV-AETH - UV-AETH (second channel of Aethalometer at 370 nm) UG/M3
AqPM2.5 - PM2.5 mass - UG/M3
AqPM10 - PM10 mass - UG/M3
AqOZONE - Ozone - ppb
"""
REMAP_WSLINK_ITEMS: dict[str, str] = {
"intem": INDOOR_TEMP,
"inhum": INDOOR_HUMIDITY,
"t1tem": OUTSIDE_TEMP,
@ -131,28 +312,137 @@ REMAP_WSLINK_ITEMS: dict = {
"t1cn": OUTSIDE_CONNECTION,
"t234c1cn": CH2_CONNECTION,
"t234c2cn": CH3_CONNECTION,
"t234c3cn": CH4_CONNECTION,
"t234c4cn": CH5_CONNECTION,
"t234c5cn": CH6_CONNECTION,
"t234c6cn": CH7_CONNECTION,
"t234c7cn": CH8_CONNECTION,
"t1chill": CHILL_INDEX,
"t1heat": HEAT_INDEX,
"t1rainhr": HOURLY_RAIN,
"t1rainwy": WEEKLY_RAIN,
"t1rainmth": MONTHLY_RAIN,
"t1rainyr": YEARLY_RAIN,
"t234c2tem": CH3_TEMP,
"t234c2hum": CH3_HUMIDITY,
"t234c3tem": CH4_TEMP,
"t234c3hum": CH4_HUMIDITY,
"t234c4tem": CH5_TEMP,
"t234c4hum": CH5_HUMIDITY,
"t234c5tem": CH6_TEMP,
"t234c5hum": CH6_HUMIDITY,
"t234c6tem": CH7_TEMP,
"t234c6hum": CH7_HUMIDITY,
"t234c7tem": CH8_TEMP,
"t234c7hum": CH8_HUMIDITY,
"t1bat": OUTSIDE_BATTERY,
"inbat": INDOOR_BATTERY,
"t234c1bat": CH2_BATTERY,
"t234c2bat": CH3_BATTERY,
"t234c3bat": CH4_BATTERY,
"t234c4bat": CH5_BATTERY,
"t234c5bat": CH6_BATTERY,
"t234c6bat": CH7_BATTERY,
"t234c7bat": CH8_BATTERY,
"t1wbgt": WBGT_TEMP,
}
# TODO: Add more sensors
# NOTE: Add more sensors
#
# 'inbat' indoor battery level (1 normal, 0 low)
# 't1bat': outdoor battery level (1 normal, 0 low)
# 't234c1bat': CH2 battery level (1 normal, 0 low) CH2 in integration is CH1 in WSLink
# 't234c1bat': CH2 battery level (1 normal, 0 low) CH2 in integration is CH1 in WSLink
#
# The following sensors should be available via WSLink.
# We need to compare them to the PWS API to make sure we have the same internal
# representation of the same sensors.
### TODO: These sensors should be supported by the WSLink API according to its API documentation:
# &t5lst= Last Lightning strike time integer
# &t5lskm= Lightning distance integer km
# &t5lsf= Lightning strike count last 1 Hours integer
# &t5ls5mtc= Lightning count total of during 5 minutes integer
# &t5ls30mtc= Lightning count total of during 30 minutes integer
# &t5ls1htc= Lightning count total of during 1 Hour integer
# &t5ls1dtc= Lightning count total of during 1 day integer
# &t5lsbat= Lightning Sensor battery (Normal=1, Low battery=0) integer
# &t5lscn= Lightning Sensor connection (Connected=1, No connect=0) integer
# &t6c1wls= Water leak sensor CH1 (Leak=1, No leak=0) integer
# &t6c1bat= Water leak sensor CH1 battery (Normal=1, Low battery=0) integer
# &t6c1cn= Water leak sensor CH1 connection (Connected=1, No connect=0) integer
# &t6c2wls= Water leak sensor CH2 (Leak=1, No leak=0) integer
# &t6c2bat= Water leak sensor CH2 battery (Normal=1, Low battery=0) integer
# &t6c2cn= Water leak sensor CH2 connection (Connected=1, No connect=0) integer
# &t6c3wls= Water leak sensor CH3 (Leak=1, No leak=0) integer
# &t6c3bat= Water leak sensor CH3 battery (Normal=1, Low battery=0) integer
# &t6c3cn= Water leak sensor CH3 connection (Connected=1, No connect=0) integer
# &t6c4wls= Water leak sensor CH4 (Leak=1, No leak=0) integer
# &t6c4bat= Water leak sensor CH4 battery (Normal=1, Low battery=0) integer
# &t6c4cn= Water leak sensor CH4 connection (Connected=1, No connect=0) integer
# &t6c5wls= Water leak sensor CH5 (Leak=1, No leak=0) integer
# &t6c5bat= Water leak sensor CH5 battery (Normal=1, Low battery=0) integer
# &t6c5cn= Water leak sensor CH5 connection (Connected=1, No connect=0) integer
# &t6c6wls= Water leak sensor CH6 (Leak=1, No leak=0) integer
# &t6c6bat= Water leak sensor CH6 battery (Normal=1, Low battery=0) integer
# &t6c6cn= Water leak sensor CH6 connection (Connected=1, No connect=0) integer
# &t6c7wls= Water leak sensor CH7 (Leak=1, No leak=0) integer
# &t6c7bat= Water leak sensor CH7 battery (Normal=1, Low battery=0) integer
# &t6c7cn= Water leak sensor CH7 connection (Connected=1, No connect=0) integer
# &t8pm25= PM2.5 concentration integer ug/m3
# &t8pm10= PM10 concentration integer ug/m3
# &t8pm25ai= PM2.5 AQI integer
# &t8pm10ai = PM10 AQI integer
# &t8bat= PM sensor battery level (0~5) remark: 5 is full integer
# &t8cn= PM sensor connection (Connected=1, No connect=0) integer
# &t9hcho= HCHO concentration integer ppb
# &t9voclv= VOC level (1~5) 1 is the highest level, 5 is the lowest VOC level integer
# &t9bat= HCHO / VOC sensor battery level (0~5) remark: 5 is full integer
# &t9cn= HCHO / VOC sensor connection (Connected=1, No connect=0) integer
# &t10co2= CO2 concentration integer ppm
# &t10bat= CO2 sensor battery level (0~5) remark: 5 is full integer
# &t10cn= CO2 sensor connection (Connected=1, No connect=0) integer
# &t11co= CO concentration integer ppm
# &t11bat= CO sensor battery level (0~5) remark: 5 is full integer
# &t11cn= CO sensor connection (Connected=1, No connect=0) integer
#
DISABLED_BY_DEFAULT: Final = [
CH2_TEMP,
CH2_HUMIDITY,
CH2_BATTERY,
CH3_TEMP,
CH3_HUMIDITY,
CH3_BATTERY,
CH4_TEMP,
CH4_HUMIDITY,
CH4_BATTERY,
CH5_TEMP,
CH5_HUMIDITY,
CH5_BATTERY,
CH6_TEMP,
CH6_HUMIDITY,
CH6_BATTERY,
CH7_TEMP,
CH7_HUMIDITY,
CH7_BATTERY,
CH8_TEMP,
CH8_HUMIDITY,
CH8_BATTERY,
OUTSIDE_BATTERY,
WBGT_TEMP,
]
BATTERY_LIST = [
OUTSIDE_BATTERY,
INDOOR_BATTERY,
CH2_BATTERY,
CH3_BATTERY,
CH4_BATTERY,
CH5_BATTERY,
CH6_BATTERY,
CH7_BATTERY,
CH8_BATTERY,
]
@ -195,3 +485,18 @@ AZIMUT: list[UnitOfDir] = [
UnitOfDir.NNW,
UnitOfDir.N,
]
class UnitOfBat(StrEnum):
"""Battery level unit of measure."""
LOW = "low"
NORMAL = "normal"
UNKNOWN = "drained"
BATTERY_LEVEL: list[UnitOfBat] = [
UnitOfBat.LOW,
UnitOfBat.NORMAL,
UnitOfBat.UNKNOWN,
]


@ -0,0 +1,21 @@
"""Shared keys for storing integration runtime state in `hass.data`.
This integration stores runtime state under:
hass.data[DOMAIN][entry_id] -> dict
Keeping keys in a dedicated module prevents subtle bugs where different modules
store different types under the same key.
"""
from __future__ import annotations
from typing import Final
# Per-entry dict keys stored under hass.data[DOMAIN][entry_id]
ENTRY_COORDINATOR: Final[str] = "coordinator"
ENTRY_ADD_ENTITIES: Final[str] = "async_add_entities"
ENTRY_DESCRIPTIONS: Final[str] = "sensor_descriptions"
ENTRY_LAST_OPTIONS: Final[str] = "last_options"
ENTRY_HEALTH_COORD: Final[str] = "coord_h"
ENTRY_HEALTH_DATA: Final[str] = "health_data"
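The layout documented in the module docstring above can be sketched with a plain dict standing in for `hass.data` (the entry id and stored object here are hypothetical placeholders):

```python
from typing import Any

DOMAIN = "sws12500"
ENTRY_COORDINATOR = "coordinator"

hass_data: dict[str, Any] = {}  # stand-in for hass.data
entry_id = "0123abcd"           # hypothetical config entry id

# Store per-entry runtime state under hass.data[DOMAIN][entry_id].
entry_store = hass_data.setdefault(DOMAIN, {}).setdefault(entry_id, {})
entry_store[ENTRY_COORDINATOR] = "coordinator-object"  # placeholder for the real coordinator

# Read it back defensively, as the integration does with checked()/checked_or():
store = hass_data.get(DOMAIN, {}).get(entry_id, {})
coordinator = store.get(ENTRY_COORDINATOR)
```

Routing every access through these shared key constants is what prevents two modules from storing different types under the same ad-hoc string key.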


@ -0,0 +1,63 @@
"""Diagnostics support for the SWS12500 integration."""
from __future__ import annotations
from copy import deepcopy
from typing import Any
from py_typecheck import checked, checked_or
from homeassistant.components.diagnostics import (
async_redact_data, # pyright: ignore[reportUnknownVariableType]
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from .const import (
API_ID,
API_KEY,
DOMAIN,
POCASI_CZ_API_ID,
POCASI_CZ_API_KEY,
WINDY_STATION_ID,
WINDY_STATION_PW,
)
from .data import ENTRY_HEALTH_COORD, ENTRY_HEALTH_DATA
TO_REDACT = {
API_ID,
API_KEY,
POCASI_CZ_API_ID,
POCASI_CZ_API_KEY,
WINDY_STATION_ID,
WINDY_STATION_PW,
"ID",
"PASSWORD",
"wsid",
"wspw",
}
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, entry: ConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
data = checked_or(hass.data.get(DOMAIN), dict[str, Any], {})
if (entry_data := checked(data.get(entry.entry_id), dict[str, Any])) is None:
entry_data = {}
health_data = checked(entry_data.get(ENTRY_HEALTH_DATA), dict[str, Any])
if health_data is None:
coordinator = entry_data.get(ENTRY_HEALTH_COORD)
health_data = getattr(coordinator, "data", None)
return {
"entry_data": async_redact_data(dict(entry.data), TO_REDACT),
"entry_options": async_redact_data(dict(entry.options), TO_REDACT),
"health_data": async_redact_data(
deepcopy(health_data) if health_data else {},
TO_REDACT,
),
}
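`async_redact_data` recursively replaces values whose keys appear in `TO_REDACT`. A simplified standalone sketch of that behavior (the `**REDACTED**` placeholder follows Home Assistant's convention, but the `redact` function here is an illustration, not the real helper):

```python
from typing import Any

TO_REDACT = {"API_ID", "API_KEY", "PASSWORD"}  # illustrative subset of the set above


def redact(data: Any, to_redact: set[str] = TO_REDACT) -> Any:
    """Recursively mask values of sensitive keys in nested dicts/lists."""
    if isinstance(data, dict):
        return {
            key: "**REDACTED**" if key in to_redact else redact(value, to_redact)
            for key, value in data.items()
        }
    if isinstance(data, list):
        return [redact(item, to_redact) for item in data]
    return data


out = redact({"API_KEY": "secret", "addon": {"PASSWORD": "x", "online": True}})
# out == {"API_KEY": "**REDACTED**", "addon": {"PASSWORD": "**REDACTED**", "online": True}}
```

Redacting a deep copy (as the diagnostics code does with `deepcopy`) ensures the live health snapshot is never mutated.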


@ -0,0 +1,335 @@
"""Health and diagnostics coordinator for the SWS12500 integration.
This module owns the integration's runtime health model. The intent is to keep
all support/debug state in one place so it can be surfaced consistently via:
- diagnostic entities (`health_sensor.py`)
- diagnostics download (`diagnostics.py`)
- the `/station/health` HTTP endpoint
The coordinator is intentionally separate from the weather data coordinator.
Weather payload handling is push-based, while health metadata is lightweight
polling plus event-driven updates (route dispatch, ingress result, forwarding).
"""
from __future__ import annotations
from asyncio import timeout
from copy import deepcopy
from datetime import timedelta
import logging
from typing import Any
import aiohttp
from aiohttp import ClientConnectionError
import aiohttp.web
from py_typecheck import checked, checked_or
from homeassistant.components.network import async_get_source_ip
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.network import get_url
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from homeassistant.util import dt as dt_util
from .const import (
DEFAULT_URL,
DOMAIN,
HEALTH_URL,
POCASI_CZ_ENABLED,
WINDY_ENABLED,
WSLINK,
WSLINK_ADDON_PORT,
WSLINK_URL,
)
from .data import ENTRY_HEALTH_DATA
from .pocasti_cz import PocasiPush
from .routes import Routes
from .windy_func import WindyPush
_LOGGER = logging.getLogger(__name__)
def _protocol_name(wslink_enabled: bool) -> str:
"""Return the configured protocol name."""
return "wslink" if wslink_enabled else "wu"
def _protocol_from_path(path: str) -> str:
"""Infer an ingress protocol label from a request path."""
if path == WSLINK_URL:
return "wslink"
if path == DEFAULT_URL:
return "wu"
if path == HEALTH_URL:
return "health"
return "unknown"
def _empty_forwarding_state(enabled: bool) -> dict[str, Any]:
"""Build the default forwarding status payload."""
return {
"enabled": enabled,
"last_status": "disabled" if not enabled else "idle",
"last_error": None,
"last_attempt_at": None,
}
def _default_health_data(config: ConfigEntry) -> dict[str, Any]:
"""Build the default health/debug payload for this config entry."""
configured_protocol = _protocol_name(checked_or(config.options.get(WSLINK), bool, False))
return {
"integration_status": f"online_{configured_protocol}",
"configured_protocol": configured_protocol,
"active_protocol": configured_protocol,
"addon": {
"online": False,
"health_endpoint": "/healthz",
"info_endpoint": "/status/internal",
"name": None,
"version": None,
"listen_port": None,
"tls": None,
"upstream_ha_port": None,
"paths": {
"wslink": WSLINK_URL,
"wu": DEFAULT_URL,
},
"raw_status": None,
},
"routes": {
"wu_enabled": False,
"wslink_enabled": False,
"health_enabled": False,
"snapshot": {},
},
"last_ingress": {
"time": None,
"protocol": "unknown",
"path": None,
"method": None,
"route_enabled": False,
"accepted": False,
"authorized": None,
"reason": "no_data",
},
"forwarding": {
"windy": _empty_forwarding_state(checked_or(config.options.get(WINDY_ENABLED), bool, False)),
"pocasi": _empty_forwarding_state(checked_or(config.options.get(POCASI_CZ_ENABLED), bool, False)),
},
}
class HealthCoordinator(DataUpdateCoordinator):
"""Maintain the integration health snapshot.
The coordinator combines:
- periodic add-on reachability checks
- live ingress observations from the HTTP dispatcher
- ingress processing results from the main webhook handler
- forwarding status from Windy/Pocasi helpers
All of that is stored as one structured JSON-like dict in `self.data`.
"""
def __init__(self, hass: HomeAssistant, config: ConfigEntry) -> None:
"""Initialize the health coordinator."""
self.hass: HomeAssistant = hass
self.config: ConfigEntry = config
super().__init__(
hass,
logger=_LOGGER,
name=f"{DOMAIN}_health",
update_interval=timedelta(minutes=1),
)
self.data: dict[str, Any] = _default_health_data(config)
def _store_runtime_health(self, data: dict[str, Any]) -> None:
"""Persist the latest health payload into entry runtime storage."""
if (domain := checked(self.hass.data.get(DOMAIN), dict[str, Any])) is None:
return
if (entry := checked(domain.get(self.config.entry_id), dict[str, Any])) is None:
return
entry[ENTRY_HEALTH_DATA] = deepcopy(data)
def _commit(self, data: dict[str, Any]) -> dict[str, Any]:
"""Publish a new health snapshot."""
self.async_set_updated_data(data)
self._store_runtime_health(data)
return data
def _refresh_summary(self, data: dict[str, Any]) -> None:
"""Derive top-level integration status from the detailed health payload."""
configured_protocol = data.get("configured_protocol", "wu")
ingress = data.get("last_ingress", {})
last_protocol = ingress.get("protocol", "unknown")
accepted = bool(ingress.get("accepted"))
reason = ingress.get("reason")
if (reason in {"route_disabled", "route_not_registered", "unauthorized"}) or (
last_protocol in {"wu", "wslink"} and last_protocol != configured_protocol
):
integration_status = "degraded"
elif accepted and last_protocol in {"wu", "wslink"}:
integration_status = f"online_{last_protocol}"
else:
integration_status = "online_idle"
data["integration_status"] = integration_status
data["active_protocol"] = (
last_protocol if accepted and last_protocol in {"wu", "wslink"} else configured_protocol
)
async def _async_update_data(self) -> dict[str, Any]:
"""Refresh add-on health metadata from the WSLink proxy."""
session = async_get_clientsession(self.hass, False)
url = get_url(self.hass)
ip = await async_get_source_ip(self.hass)
port = checked_or(self.config_entry.options.get(WSLINK_ADDON_PORT), int, 443)
health_url = f"https://{ip}:{port}/healthz"
info_url = f"https://{ip}:{port}/status/internal"
data = deepcopy(self.data)
addon = data["addon"]
addon["health_url"] = health_url
addon["info_url"] = info_url
addon["home_assistant_url"] = url
addon["home_assistant_source_ip"] = str(ip)
addon["online"] = False
try:
async with timeout(5), session.get(health_url) as response:
addon["online"] = checked(response.status, int) == 200
except ClientConnectionError:
addon["online"] = False
raw_status: dict[str, Any] | None = None
if addon["online"]:
try:
async with timeout(5), session.get(info_url) as info_response:
if checked(info_response.status, int) == 200:
raw_status = await info_response.json(content_type=None)
except (ClientConnectionError, aiohttp.ContentTypeError, ValueError):
raw_status = None
addon["raw_status"] = raw_status
if raw_status:
addon["name"] = raw_status.get("addon")
addon["version"] = raw_status.get("version")
addon["listen_port"] = raw_status.get("listen", {}).get("port")
addon["tls"] = raw_status.get("listen", {}).get("tls")
addon["upstream_ha_port"] = raw_status.get("upstream", {}).get("ha_port")
addon["paths"] = {
"wslink": raw_status.get("paths", {}).get("wslink", WSLINK_URL),
"wu": raw_status.get("paths", {}).get("wu", DEFAULT_URL),
}
self._refresh_summary(data)
return self._commit(data)
def update_routing(self, routes: Routes | None) -> None:
"""Store the currently enabled routes for diagnostics."""
data = deepcopy(self.data)
data["configured_protocol"] = _protocol_name(checked_or(self.config.options.get(WSLINK), bool, False))
if routes is not None:
data["routes"] = {
"wu_enabled": routes.path_enabled(DEFAULT_URL),
"wslink_enabled": routes.path_enabled(WSLINK_URL),
"health_enabled": routes.path_enabled(HEALTH_URL),
"snapshot": routes.snapshot(),
}
self._refresh_summary(data)
self._commit(data)
def record_dispatch(self, request: aiohttp.web.Request, route_enabled: bool, reason: str | None) -> None:
"""Record every ingress observed by the dispatcher.
This runs before the actual webhook handler. It lets diagnostics answer:
- which endpoint the station is calling
- whether the route was enabled
- whether the request was rejected before processing
"""
# We do not want to process health requests
if request.path == HEALTH_URL:
return
data = deepcopy(self.data)
data["last_ingress"] = {
"time": dt_util.utcnow().isoformat(),
"protocol": _protocol_from_path(request.path),
"path": request.path,
"method": request.method,
"route_enabled": route_enabled,
"accepted": False,
"authorized": None,
"reason": reason or "pending",
}
self._refresh_summary(data)
self._commit(data)
def update_ingress_result(
self,
request: aiohttp.web.Request,
*,
accepted: bool,
authorized: bool | None,
reason: str | None = None,
) -> None:
"""Store the final processing result of a webhook request."""
data = deepcopy(self.data)
ingress = data.get("last_ingress", {})
ingress.update(
{
"time": dt_util.utcnow().isoformat(),
"protocol": _protocol_from_path(request.path),
"path": request.path,
"method": request.method,
"accepted": accepted,
"authorized": authorized,
"reason": reason or ("accepted" if accepted else "rejected"),
}
)
data["last_ingress"] = ingress
self._refresh_summary(data)
self._commit(data)
def update_forwarding(self, windy: WindyPush, pocasi: PocasiPush) -> None:
"""Store forwarding subsystem statuses for diagnostics."""
data = deepcopy(self.data)
data["forwarding"] = {
"windy": {
"enabled": windy.enabled,
"last_status": windy.last_status,
"last_error": windy.last_error,
"last_attempt_at": windy.last_attempt_at,
},
"pocasi": {
"enabled": pocasi.enabled,
"last_status": pocasi.last_status,
"last_error": pocasi.last_error,
"last_attempt_at": pocasi.last_attempt_at,
},
}
self._refresh_summary(data)
self._commit(data)
async def health_status(self, _: aiohttp.web.Request) -> aiohttp.web.Response:
"""Serve the current health snapshot over HTTP.
The endpoint forces one refresh before returning so that the caller sees
a reasonably fresh add-on status.
"""
await self.async_request_refresh()
return aiohttp.web.json_response(self.data, status=200)


@ -0,0 +1,273 @@
"""Health diagnostic sensors for SWS-12500."""
from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from functools import cached_property
from typing import Any, cast
from py_typecheck import checked, checked_or
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
SensorEntityDescription,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceEntryType
from homeassistant.helpers.entity import DeviceInfo
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from homeassistant.util import dt as dt_util
from .const import DOMAIN
from .data import ENTRY_HEALTH_COORD
@dataclass(frozen=True, kw_only=True)
class HealthSensorEntityDescription(SensorEntityDescription):
"""Description for health diagnostic sensors."""
data_path: tuple[str, ...]
value_fn: Callable[[Any], Any] | None = None
def _resolve_path(data: dict[str, Any], path: tuple[str, ...]) -> Any:
"""Resolve a nested path from a dictionary."""
current: Any = data
for key in path:
if checked(current, dict[str, Any]) is None:
return None
current = current.get(key)
return current
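A standalone sketch of the nested-path lookup above, with a plain `isinstance` check standing in for `py_typecheck.checked`:

```python
from typing import Any


def resolve_path(data: dict[str, Any], path: tuple[str, ...]) -> Any:
    """Walk a nested dict along `path`, returning None on any missing level."""
    current: Any = data
    for key in path:
        if not isinstance(current, dict):
            return None
        current = current.get(key)
    return current


snapshot = {"addon": {"online": True}, "last_ingress": {"protocol": "wslink"}}
resolve_path(snapshot, ("addon", "online"))       # -> True
resolve_path(snapshot, ("addon", "missing"))      # -> None
resolve_path(snapshot, ("addon", "online", "x"))  # -> None (True is not a dict)
```

Returning `None` on every failure mode lets each sensor description declare only its `data_path` and leave missing data handling to one place.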
def _on_off(value: Any) -> str:
"""Render a boolean-ish value as `on` / `off`."""
return "on" if bool(value) else "off"
def _accepted_state(value: Any) -> str:
"""Render ingress acceptance state."""
return "accepted" if bool(value) else "rejected"
def _authorized_state(value: Any) -> str:
"""Render ingress authorization state."""
if value is None:
return "unknown"
return "authorized" if bool(value) else "unauthorized"
def _timestamp_or_none(value: Any) -> Any:
"""Convert ISO timestamp string to datetime for HA rendering."""
if not isinstance(value, str):
return None
return dt_util.parse_datetime(value)
HEALTH_SENSOR_DESCRIPTIONS: tuple[HealthSensorEntityDescription, ...] = (
HealthSensorEntityDescription(
key="integration_health",
translation_key="integration_health",
icon="mdi:heart-pulse",
data_path=("integration_status",),
),
HealthSensorEntityDescription(
key="active_protocol",
translation_key="active_protocol",
icon="mdi:swap-horizontal",
data_path=("active_protocol",),
),
HealthSensorEntityDescription(
key="wslink_addon_status",
translation_key="wslink_addon_status",
icon="mdi:server-network",
data_path=("addon", "online"),
value_fn=lambda value: "online" if value else "offline",
),
HealthSensorEntityDescription(
key="wslink_addon_name",
translation_key="wslink_addon_name",
icon="mdi:package-variant-closed",
data_path=("addon", "name"),
),
HealthSensorEntityDescription(
key="wslink_addon_version",
translation_key="wslink_addon_version",
icon="mdi:label-outline",
data_path=("addon", "version"),
),
HealthSensorEntityDescription(
key="wslink_addon_listen_port",
translation_key="wslink_addon_listen_port",
icon="mdi:lan-connect",
data_path=("addon", "listen_port"),
),
HealthSensorEntityDescription(
key="wslink_upstream_ha_port",
translation_key="wslink_upstream_ha_port",
icon="mdi:transit-connection-variant",
data_path=("addon", "upstream_ha_port"),
),
HealthSensorEntityDescription(
key="route_wu_enabled",
translation_key="route_wu_enabled",
icon="mdi:transit-connection-horizontal",
data_path=("routes", "wu_enabled"),
value_fn=_on_off,
),
HealthSensorEntityDescription(
key="route_wslink_enabled",
translation_key="route_wslink_enabled",
icon="mdi:transit-connection-horizontal",
data_path=("routes", "wslink_enabled"),
value_fn=_on_off,
),
HealthSensorEntityDescription(
key="last_ingress_time",
translation_key="last_ingress_time",
icon="mdi:clock-outline",
device_class=SensorDeviceClass.TIMESTAMP,
data_path=("last_ingress", "time"),
value_fn=_timestamp_or_none,
),
HealthSensorEntityDescription(
key="last_ingress_protocol",
translation_key="last_ingress_protocol",
icon="mdi:download-network",
data_path=("last_ingress", "protocol"),
),
HealthSensorEntityDescription(
key="last_ingress_route_enabled",
translation_key="last_ingress_route_enabled",
icon="mdi:check-network",
data_path=("last_ingress", "route_enabled"),
value_fn=_on_off,
),
HealthSensorEntityDescription(
key="last_ingress_accepted",
translation_key="last_ingress_accepted",
icon="mdi:check-decagram",
data_path=("last_ingress", "accepted"),
value_fn=_accepted_state,
),
HealthSensorEntityDescription(
key="last_ingress_authorized",
translation_key="last_ingress_authorized",
icon="mdi:key",
data_path=("last_ingress", "authorized"),
value_fn=_authorized_state,
),
HealthSensorEntityDescription(
key="last_ingress_reason",
translation_key="last_ingress_reason",
icon="mdi:message-alert-outline",
data_path=("last_ingress", "reason"),
),
HealthSensorEntityDescription(
key="forward_windy_enabled",
translation_key="forward_windy_enabled",
icon="mdi:weather-windy",
data_path=("forwarding", "windy", "enabled"),
value_fn=_on_off,
),
HealthSensorEntityDescription(
key="forward_windy_status",
translation_key="forward_windy_status",
icon="mdi:weather-windy",
data_path=("forwarding", "windy", "last_status"),
),
HealthSensorEntityDescription(
key="forward_pocasi_enabled",
translation_key="forward_pocasi_enabled",
icon="mdi:cloud-upload-outline",
data_path=("forwarding", "pocasi", "enabled"),
value_fn=_on_off,
),
HealthSensorEntityDescription(
key="forward_pocasi_status",
translation_key="forward_pocasi_status",
icon="mdi:cloud-upload-outline",
data_path=("forwarding", "pocasi", "last_status"),
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
async_add_entities: AddEntitiesCallback,
) -> None:
"""Set up health diagnostic sensors."""
if (data := checked(hass.data.get(DOMAIN), dict[str, Any])) is None:
return
if (entry_data := checked(data.get(entry.entry_id), dict[str, Any])) is None:
return
coordinator = entry_data.get(ENTRY_HEALTH_COORD)
if coordinator is None:
return
entities = [
HealthDiagnosticSensor(coordinator=coordinator, description=description)
for description in HEALTH_SENSOR_DESCRIPTIONS
]
async_add_entities(entities)
class HealthDiagnosticSensor( # pyright: ignore[reportIncompatibleVariableOverride]
CoordinatorEntity, SensorEntity
):
"""Health diagnostic sensor for SWS-12500."""
_attr_has_entity_name = True
_attr_should_poll = False
def __init__(
self,
coordinator: Any,
description: HealthSensorEntityDescription,
) -> None:
"""Initialize the sensor."""
super().__init__(coordinator)
self.entity_description = description
self._attr_entity_category = EntityCategory.DIAGNOSTIC
self._attr_unique_id = f"{description.key}_health"
@property
def native_value(self) -> Any: # pyright: ignore[reportIncompatibleVariableOverride]
"""Return the current diagnostic value."""
data = checked_or(self.coordinator.data, dict[str, Any], {})
description = cast("HealthSensorEntityDescription", self.entity_description)
value = _resolve_path(data, description.data_path)
if description.value_fn is not None:
return description.value_fn(value)
return value
@property
def extra_state_attributes(self) -> dict[str, Any] | None: # pyright: ignore[reportIncompatibleVariableOverride]
"""Expose the full health JSON on the main health sensor for debugging."""
if self.entity_description.key != "integration_health":
return None
return checked_or(self.coordinator.data, dict[str, Any], None)
@cached_property
def device_info(self) -> DeviceInfo:
"""Device info."""
return DeviceInfo(
connections=set(),
name="Weather Station SWS 12500",
entry_type=DeviceEntryType.SERVICE,
identifiers={(DOMAIN,)}, # type: ignore[arg-type]
manufacturer="Schizza",
model="Weather Station SWS 12500",
)

View File

@ -0,0 +1,14 @@
{
"entity": {
"sensor": {
"indoor_battery": {
"default": "mdi:battery-unknown",
"state": {
"low": "mdi:battery-low",
"normal": "mdi:battery",
"drained": "mdi:battery-alert"
}
}
}
}
}

View File

@ -8,8 +8,8 @@
"homekit": {},
"iot_class": "local_push",
"issue_tracker": "https://github.com/schizza/SWS-12500-custom-component/issues",
"requirements": [],
"requirements": ["typecheck-runtime==0.2.0"],
"ssdp": [],
"version": "1.6.2",
"version": "1.6.9",
"zeroconf": []
}

View File

@ -0,0 +1,176 @@
"""Pocasi CZ resend functions."""
from datetime import datetime, timedelta
import logging
from typing import Any, Literal
from aiohttp import ClientError
from py_typecheck.core import checked
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import (
DEFAULT_URL,
POCASI_CZ_API_ID,
POCASI_CZ_API_KEY,
POCASI_CZ_ENABLED,
POCASI_CZ_LOGGER_ENABLED,
POCASI_CZ_SEND_INTERVAL,
POCASI_CZ_SUCCESS,
POCASI_CZ_UNEXPECTED,
POCASI_CZ_URL,
POCASI_INVALID_KEY,
WSLINK_URL,
)
from .utils import anonymize, update_options
_LOGGER = logging.getLogger(__name__)
class PocasiNotInserted(Exception):
"""NotInserted state."""
class PocasiSuccess(Exception):
"""PocasiSuccess state."""
class PocasiApiKeyError(Exception):
"""Pocasi API Key error."""
class PocasiPush:
"""Push data to Pocasi CZ."""
def __init__(self, hass: HomeAssistant, config: ConfigEntry) -> None:
"""Init."""
self.hass = hass
self.config = config
self.enabled: bool = self.config.options.get(POCASI_CZ_ENABLED, False)
self.last_status: str = "disabled" if not self.enabled else "idle"
self.last_error: str | None = None
self.last_attempt_at: str | None = None
self._interval = int(self.config.options.get(POCASI_CZ_SEND_INTERVAL, 30))
self.last_update = datetime.now()
self.next_update = datetime.now() + timedelta(seconds=self._interval)
self.log = self.config.options.get(POCASI_CZ_LOGGER_ENABLED)
self.invalid_response_count = 0
def verify_response(
self,
response: str,
) -> PocasiNotInserted | PocasiSuccess | PocasiApiKeyError | None:
"""Verify the answer from the server."""
if self.log:
_LOGGER.debug("Pocasi CZ responded: %s", response)
# The server currently does not provide any meaningful responses.
# This is a placeholder in case that changes in the future.
return None
async def push_data_to_server(
self, data: dict[str, Any], mode: Literal["WU", "WSLINK"]
) -> None:
"""Push weather data to the server."""
_data = data.copy()
self.enabled = self.config.options.get(POCASI_CZ_ENABLED, False)
self.last_attempt_at = datetime.now().isoformat()
self.last_error = None
if (_api_id := checked(self.config.options.get(POCASI_CZ_API_ID), str)) is None:
_LOGGER.error(
"No API ID is provided for Pocasi Meteo. Check your configuration."
)
self.last_status = "config_error"
self.last_error = "Missing API ID."
return
if (
_api_key := checked(self.config.options.get(POCASI_CZ_API_KEY), str)
) is None:
_LOGGER.error(
"No API Key is provided for Pocasi Meteo. Check your configuration."
)
self.last_status = "config_error"
self.last_error = "Missing API key."
return
if self.log:
_LOGGER.info(
"Pocasi CZ last update = %s, next update at: %s",
str(self.last_update),
str(self.next_update),
)
if self.next_update > datetime.now():
self.last_status = "rate_limited_local"
_LOGGER.debug(
"Triggered update interval limit of %s seconds. Next possible update is set to: %s",
self._interval,
self.next_update,
)
return
request_url: str = ""
if mode == "WSLINK":
_data["wsid"] = _api_id
_data["wspw"] = _api_key
request_url = f"{POCASI_CZ_URL}{WSLINK_URL}"
if mode == "WU":
_data["ID"] = _api_id
_data["PASSWORD"] = _api_key
request_url = f"{POCASI_CZ_URL}{DEFAULT_URL}"
session = async_get_clientsession(self.hass)
_LOGGER.debug(
"Payload for Pocasi Meteo server: [mode=%s] [request_url=%s] = %s",
mode,
request_url,
anonymize(_data),
)
try:
async with session.get(request_url, params=_data) as resp:
status = await resp.text()
try:
self.verify_response(status)
except PocasiApiKeyError:
# log regardless of logger settings
self.last_status = "auth_error"
self.last_error = POCASI_INVALID_KEY
self.enabled = False
_LOGGER.critical(POCASI_INVALID_KEY)
await update_options(
self.hass, self.config, POCASI_CZ_ENABLED, False
)
except PocasiSuccess:
self.last_status = "ok"
self.last_error = None
if self.log:
_LOGGER.info(POCASI_CZ_SUCCESS)
else:
self.last_status = "ok"
except ClientError as ex:
self.last_status = "client_error"
self.last_error = str(ex)
_LOGGER.critical("Invalid response from Pocasi Meteo: %s", str(ex))
self.invalid_response_count += 1
if self.invalid_response_count > 3:
_LOGGER.critical(POCASI_CZ_UNEXPECTED)
self.enabled = False
await update_options(self.hass, self.config, POCASI_CZ_ENABLED, False)
self.last_update = datetime.now()
self.next_update = datetime.now() + timedelta(seconds=self._interval)
if self.log:
_LOGGER.info("Next update: %s", str(self.next_update))

View File

@ -1,76 +1,172 @@
"""Store routes info."""
"""Routes implementation.
from dataclasses import dataclass
from logging import getLogger
Why this dispatcher exists
--------------------------
Home Assistant registers aiohttp routes on startup. Re-registering or removing routes at runtime
is awkward and error-prone (and can raise if routes already exist). This integration supports two
different push endpoints (legacy WU-style vs WSLink). To allow switching between them without
touching the aiohttp router, we register both routes once and use this in-process dispatcher to
decide which one is currently enabled.
from aiohttp.web import AbstractRoute, Response
Important note:
- Each route stores a *bound method* handler (e.g. `coordinator.received_data`). That means the
route points to a specific coordinator instance. When the integration reloads, we must keep the
same coordinator instance or update the stored handler accordingly. Otherwise requests may go to
an old coordinator while entities listen to a new one (result: UI appears "frozen").
"""
_LOGGER = getLogger(__name__)
from collections.abc import Awaitable, Callable
from dataclasses import dataclass, field
import logging
from typing import Any
from aiohttp.web import AbstractRoute, Request, Response
_LOGGER = logging.getLogger(__name__)
Handler = Callable[[Request], Awaitable[Response]]
IngressObserver = Callable[[Request, bool, str | None], None]
@dataclass
class Route:
"""Store route info."""
class RouteInfo:
"""Route definition held by the dispatcher.
- `handler` is the real webhook handler (bound method).
- `fallback` is used when the route exists but is currently disabled.
"""
url_path: str
route: AbstractRoute
handler: callable
handler: Handler
enabled: bool = False
sticky: bool = False
fallback: Handler = field(default_factory=lambda: unregistered)
def __str__(self):
"""Return string representation."""
return f"{self.url_path} -> {self.handler}"
return f"RouteInfo(url_path={self.url_path}, route={self.route}, handler={self.handler}, enabled={self.enabled}, fallback={self.fallback})"
class Routes:
"""Store routes info."""
"""Simple route dispatcher.
We register aiohttp routes once and direct traffic to the currently enabled endpoint
using `switch_route`. This keeps route registration stable while still allowing the
integration to support multiple incoming push formats.
"""
def __init__(self) -> None:
"""Initialize routes."""
self.routes = {}
"""Initialize dispatcher storage."""
self.routes: dict[str, RouteInfo] = {}
self._ingress_observer: IngressObserver | None = None
def switch_route(self, coordinator: callable, url_path: str):
"""Switch route."""
def set_ingress_observer(self, observer: IngressObserver | None) -> None:
"""Set a callback notified for every incoming dispatcher request."""
self._ingress_observer = observer
for url, route in self.routes.items():
if url == url_path:
_LOGGER.info("New coordinator to route: %s", route.url_path)
async def dispatch(self, request: Request) -> Response:
"""Dispatch incoming request to either the enabled handler or a fallback."""
key = f"{request.method}:{request.path}"
info = self.routes.get(key)
if not info:
_LOGGER.debug(
"Route (%s):%s is not registered!", request.method, request.path
)
if self._ingress_observer is not None:
self._ingress_observer(request, False, "route_not_registered")
return await unregistered(request)
if self._ingress_observer is not None:
self._ingress_observer(
request,
info.enabled,
None if info.enabled else "route_disabled",
)
handler = info.handler if info.enabled else info.fallback
return await handler(request)
def switch_route(self, handler: Handler, url_path: str) -> None:
"""Enable the route matching `url_path` and disable all others; sticky routes keep their current state.
This is called when options change (e.g. the WSLink toggle). The aiohttp router stays
untouched; we only flip which internal handler is active.
"""
for route in self.routes.values():
if route.sticky:
continue
if route.url_path == url_path:
_LOGGER.info(
"New coordinator to route: (%s):%s",
route.route.method,
route.url_path,
)
route.enabled = True
route.handler = coordinator
route.route._handler = coordinator # noqa: SLF001
route.handler = handler
else:
route.enabled = False
route.handler = unregistred
route.route._handler = unregistred # noqa: SLF001
route.handler = unregistered
def add_route(
self,
url_path: str,
route: AbstractRoute,
handler: callable,
handler: Handler,
*,
enabled: bool = False,
):
"""Add route."""
self.routes[url_path] = Route(url_path, route, handler, enabled)
sticky: bool = False,
) -> None:
"""Register a route in the dispatcher.
def get_route(self, url_path: str) -> Route:
"""Get route."""
return self.routes.get(url_path)
This does not register anything in aiohttp. It only stores routing metadata that
`dispatch` uses after aiohttp has routed the request by path.
"""
key = f"{route.method}:{url_path}"
self.routes[key] = RouteInfo(
url_path, route=route, handler=handler, enabled=enabled, sticky=sticky
)
_LOGGER.debug("Registered dispatcher for route (%s):%s", route.method, url_path)
def get_enabled(self) -> str:
"""Get enabled routes."""
enabled_routes = [
route.url_path for route in self.routes.values() if route.enabled
]
return "".join(enabled_routes) if enabled_routes else "None"
def show_enabled(self) -> str:
"""Return a human-readable description of the currently enabled route."""
def __str__(self):
"""Return string representation."""
return "\n".join([str(route) for route in self.routes.values()])
enabled_routes = {
f"Dispatcher enabled for ({route.route.method}):{route.url_path}, with handler: {route.handler}"
for route in self.routes.values()
if route.enabled
}
if not enabled_routes:
return "No routes are enabled."
return ", ".join(sorted(enabled_routes))
def path_enabled(self, url_path: str) -> bool:
"""Return whether any route registered for `url_path` is enabled."""
return any(
route.enabled for route in self.routes.values() if route.url_path == url_path
)
def snapshot(self) -> dict[str, Any]:
"""Return a compact routing snapshot for diagnostics."""
return {
key: {
"path": route.url_path,
"method": route.route.method,
"enabled": route.enabled,
"sticky": route.sticky,
}
for key, route in self.routes.items()
}
async def unregistred(*args, **kwargs):
"""Unregister path to handle incoming data."""
async def unregistered(request: Request) -> Response:
"""Fallback response for unknown/disabled routes.
_LOGGER.error("Recieved data to unregistred webhook. Check your settings")
return Response(body=f"{'Unregistred webhook.'}", status=404)
This should normally never happen for correctly configured stations, but it provides
a clear error message when the station pushes to the wrong endpoint.
"""
_ = request
_LOGGER.debug("Received data on an unregistered or disabled webhook.")
return Response(text="Unregistered webhook. Check your settings.", status=400)

View File

@ -1,16 +1,36 @@
"""Sensors definition for SWS12500."""
"""Sensor platform for SWS12500.
This module creates sensor entities based on the config entry options.
The integration is push-based (webhook), so we avoid reloading the entry for
auto-discovered sensors. Instead, we dynamically add new entities at runtime
using the `async_add_entities` callback stored in `hass.data`.
Why not reload on auto-discovery?
Reloading a config entry unloads platforms temporarily, which removes coordinator
listeners. With frequent webhook pushes, this can create a window where nothing is
subscribed and the frontend appears "frozen" until another full reload/restart.
Runtime state is stored under:
hass.data[DOMAIN][entry_id] -> dict with known keys (see `data.py`)
"""
from collections.abc import Callable
from functools import cached_property
import logging
from typing import Any, cast
from homeassistant.components.sensor import RestoreSensor, SensorEntity
from py_typecheck import checked, checked_or
from homeassistant.components.sensor import SensorEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant, callback
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceEntryType
from homeassistant.helpers.entity import DeviceInfo, generate_entity_id
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from . import WeatherDataUpdateCoordinator
from . import health_sensor
from .const import (
CHILL_INDEX,
DOMAIN,
@ -23,121 +43,233 @@ from .const import (
WIND_SPEED,
WSLINK,
)
from .data import ENTRY_ADD_ENTITIES, ENTRY_COORDINATOR, ENTRY_DESCRIPTIONS
from .sensors_common import WeatherSensorEntityDescription
from .sensors_weather import SENSOR_TYPES_WEATHER_API
from .sensors_wslink import SENSOR_TYPES_WSLINK
from .utils import chill_index, heat_index
_LOGGER = logging.getLogger(__name__)
# The `async_add_entities` callback accepts a list of Entity-like objects.
# We keep the type loose here to avoid propagating HA generics (`DataUpdateCoordinator[T]`)
# that often end up as "partially unknown" under type-checkers.
_AddEntitiesFn = Callable[[list[SensorEntity]], None]
def _auto_enable_derived_sensors(requested: set[str]) -> set[str]:
"""Auto-enable derived sensors when their source fields are present.
This does NOT model strict dependencies ("if you want X, we force-add inputs").
Instead, it opportunistically enables derived outputs when the station already
provides the raw fields needed to compute them.
"""
expanded = set(requested)
# Wind azimut depends on wind dir
if WIND_DIR in expanded:
expanded.add(WIND_AZIMUT)
# Heat index depends on temp + humidity
if OUTSIDE_TEMP in expanded and OUTSIDE_HUMIDITY in expanded:
expanded.add(HEAT_INDEX)
# Chill index depends on temp + wind speed
if OUTSIDE_TEMP in expanded and WIND_SPEED in expanded:
expanded.add(CHILL_INDEX)
return expanded
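For reference, the expansion rules above behave like this standalone replica (the constant values are assumed stand-ins for the keys in `const.py`):

```python
# Standalone replica of _auto_enable_derived_sensors; key strings are assumptions.
WIND_DIR, WIND_AZIMUT = "wind_dir", "wind_azimut"
OUTSIDE_TEMP, OUTSIDE_HUMIDITY = "outside_temp", "outside_humidity"
WIND_SPEED, HEAT_INDEX, CHILL_INDEX = "wind_speed", "heat_index", "chill_index"


def auto_enable(requested: set[str]) -> set[str]:
    expanded = set(requested)
    if WIND_DIR in expanded:
        expanded.add(WIND_AZIMUT)
    if OUTSIDE_TEMP in expanded and OUTSIDE_HUMIDITY in expanded:
        expanded.add(HEAT_INDEX)
    if OUTSIDE_TEMP in expanded and WIND_SPEED in expanded:
        expanded.add(CHILL_INDEX)
    return expanded


print(sorted(auto_enable({"wind_dir"})))
print(sorted(auto_enable({"outside_temp", "wind_speed"})))
```

Note the rules only add outputs; they never force-add missing inputs, matching the "opportunistic" behavior described in the docstring.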
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
async_add_entities: AddEntitiesCallback,
) -> None:
"""Set up Weather Station sensors."""
"""Set up Weather Station sensors.
coordinator: WeatherDataUpdateCoordinator = hass.data[DOMAIN][config_entry.entry_id]
We also store `async_add_entities` and a map of sensor descriptions in `hass.data`
so the webhook handler can add newly discovered entities dynamically without
reloading the config entry.
"""
sensors_to_load: list = []
sensors: list = []
_wslink = config_entry.data.get(WSLINK)
if (hass_data := checked(hass.data.setdefault(DOMAIN, {}), dict[str, Any])) is None:
return
SENSOR_TYPES = SENSOR_TYPES_WSLINK if _wslink else SENSOR_TYPES_WEATHER_API
# We have to check that entry_data is present.
# It is created by integration setup, so it should be present.
if (
entry_data := checked(hass_data.get(config_entry.entry_id), dict[str, Any])
) is None:
# This should not happen in normal operation.
return
# Check if we have some sensors to load.
if sensors_to_load := config_entry.options.get(SENSORS_TO_LOAD):
if WIND_DIR in sensors_to_load:
sensors_to_load.append(WIND_AZIMUT)
if (OUTSIDE_HUMIDITY in sensors_to_load) and (OUTSIDE_TEMP in sensors_to_load):
sensors_to_load.append(HEAT_INDEX)
coordinator = entry_data.get(ENTRY_COORDINATOR)
if coordinator is None:
# Coordinator is created by the integration (`__init__.py`). Without it, we cannot set up entities.
# This should not happen in normal operation; treat it as a no-op setup.
return
if (WIND_SPEED in sensors_to_load) and (OUTSIDE_TEMP in sensors_to_load):
sensors_to_load.append(CHILL_INDEX)
sensors = [
WeatherSensor(hass, description, coordinator)
for description in SENSOR_TYPES
if description.key in sensors_to_load
]
async_add_entities(sensors)
# Store the platform callback so we can add entities later (auto-discovery) without reload.
entry_data[ENTRY_ADD_ENTITIES] = async_add_entities
# Wire up the integration health diagnostic sensor.
# This is kept in a dedicated module (`health_sensor.py`) for readability.
await health_sensor.async_setup_entry(hass, config_entry, async_add_entities)
wslink_enabled = checked_or(config_entry.options.get(WSLINK), bool, False)
sensor_types = SENSOR_TYPES_WSLINK if wslink_enabled else SENSOR_TYPES_WEATHER_API
# Keep a descriptions map for dynamic entity creation by key.
# When the station starts sending a new payload field, the webhook handler can
# look up its description here and instantiate the matching entity.
entry_data[ENTRY_DESCRIPTIONS] = {desc.key: desc for desc in sensor_types}
sensors_to_load = checked_or(
config_entry.options.get(SENSORS_TO_LOAD), list[str], []
)
if not sensors_to_load:
return
requested = _auto_enable_derived_sensors(set(sensors_to_load))
entities: list[WeatherSensor] = [
WeatherSensor(description, coordinator)
for description in sensor_types
if description.key in requested
]
async_add_entities(entities)
class WeatherSensor(
CoordinatorEntity[WeatherDataUpdateCoordinator], RestoreSensor, SensorEntity
):
"""Implementation of Weather Sensor entity."""
def add_new_sensors(
hass: HomeAssistant, config_entry: ConfigEntry, keys: list[str]
) -> None:
"""Dynamically add newly discovered sensors without reloading the entry.
Called by the webhook handler when the station starts sending new fields.
Design notes:
- This function is intentionally a safe no-op if the sensor platform hasn't
finished setting up yet (e.g. callback/description map missing).
- Unknown payload keys are ignored (only keys with an entity description are added).
"""
if (hass_data := checked(hass.data.get(DOMAIN), dict[str, Any])) is None:
return
if (
entry_data := checked(hass_data.get(config_entry.entry_id), dict[str, Any])
) is None:
return
add_entities = entry_data.get(ENTRY_ADD_ENTITIES)
descriptions = entry_data.get(ENTRY_DESCRIPTIONS)
coordinator = entry_data.get(ENTRY_COORDINATOR)
if add_entities is None or descriptions is None or coordinator is None:
return
add_entities_fn = cast("_AddEntitiesFn", add_entities)
descriptions_map = cast("dict[str, WeatherSensorEntityDescription]", descriptions)
new_entities: list[SensorEntity] = []
for key in keys:
desc = descriptions_map.get(key)
if desc is None:
continue
new_entities.append(WeatherSensor(desc, coordinator))
if new_entities:
add_entities_fn(new_entities)
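The safe no-op contract of `add_new_sensors` (store the platform callback, add later, silently skip if setup hasn't finished) can be sketched without Home Assistant. All names here are illustrative; the real state lives in `hass.data[DOMAIN][entry_id]`:

```python
from collections.abc import Callable

AddEntities = Callable[[list[str]], None]
_runtime: dict[str, AddEntities] = {}  # stand-in for hass.data[DOMAIN][entry_id]


def platform_setup(entry_id: str, add_entities: AddEntities) -> None:
    """Platform stores its add-entities callback for later dynamic use."""
    _runtime[entry_id] = add_entities


def webhook_discovered(entry_id: str, keys: list[str], known: dict[str, str]) -> None:
    """Webhook side: add only keys that have a description; no-op before setup."""
    add = _runtime.get(entry_id)
    if add is None:
        return  # platform not ready yet: safe no-op
    entities = [known[k] for k in keys if k in known]
    if entities:
        add(entities)


created: list[str] = []
webhook_discovered("e1", ["uv"], {"uv": "UvSensor"})  # before setup: nothing happens
platform_setup("e1", created.extend)
webhook_discovered("e1", ["uv", "bogus"], {"uv": "UvSensor"})  # "bogus" is ignored
print(created)  # ['UvSensor']
```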
class WeatherSensor( # pyright: ignore[reportIncompatibleVariableOverride]
CoordinatorEntity, SensorEntity
): # pyright: ignore[reportIncompatibleVariableOverride]
"""Implementation of Weather Sensor entity.
We intentionally keep the coordinator type unparameterized here to avoid
propagating HA's generic `DataUpdateCoordinator[T]` typing into this module.
"""
_attr_has_entity_name = True
_attr_should_poll = False
def __init__(
self,
hass: HomeAssistant,
description: WeatherSensorEntityDescription,
coordinator: WeatherDataUpdateCoordinator,
coordinator: Any,
) -> None:
"""Initialize sensor."""
super().__init__(coordinator)
self.hass = hass
self.coordinator = coordinator
self.entity_description = description
self._attr_unique_id = description.key
self._data = None
async def async_added_to_hass(self) -> None:
"""Handle listeners to reloaded sensors."""
await super().async_added_to_hass()
self.coordinator.async_add_listener(self._handle_coordinator_update)
# prev_state_data = await self.async_get_last_sensor_data()
# prev_state = await self.async_get_last_state()
# if not prev_state:
# return
# self._data = prev_state_data.native_value
@callback
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
self._data = self.coordinator.data.get(self.entity_description.key)
super()._handle_coordinator_update()
self.async_write_ha_state()
config_entry = getattr(self.coordinator, "config", None)
self._dev_log = checked_or(
config_entry.options.get("dev_debug_checkbox")
if config_entry is not None
else False,
bool,
False,
)
@property
def native_value(self) -> str | int | float | None:
"""Return value of entity."""
def native_value(self): # pyright: ignore[reportIncompatibleVariableOverride]
"""Return the current sensor state.
_wslink = self.coordinator.config.options.get(WSLINK)
Resolution order:
1) If `value_from_data_fn` is provided, it receives the full payload dict and can compute
derived values (e.g. battery enum mapping, azimut text, heat/chill indices).
2) Otherwise we read the raw value for this key from the payload and pass it through `value_fn`.
if self.coordinator.data and (WIND_AZIMUT in self.entity_description.key):
return self.entity_description.value_fn(self.coordinator.data.get(WIND_DIR))
Payload normalization:
- The station sometimes sends empty strings for missing fields; we treat "" as no value (None).
"""
data: dict[str, Any] = checked_or(self.coordinator.data, dict[str, Any], {})
key = self.entity_description.key
if (
self.coordinator.data
and (HEAT_INDEX in self.entity_description.key)
and not _wslink
):
return self.entity_description.value_fn(heat_index(self.coordinator.data))
description = cast("WeatherSensorEntityDescription", self.entity_description)
if (
self.coordinator.data
and (CHILL_INDEX in self.entity_description.key)
and not _wslink
):
return self.entity_description.value_fn(chill_index(self.coordinator.data))
if description.value_from_data_fn is not None:
try:
value = description.value_from_data_fn(data)
except Exception: # noqa: BLE001
_LOGGER.exception(
"native_value compute failed via value_from_data_fn for key=%s", key
)
return None
return None if self._data == "" else self.entity_description.value_fn(self._data)
return value
raw = data.get(key)
if raw is None or raw == "":
if self._dev_log:
_LOGGER.debug("native_value missing raw: key=%s raw=%s", key, raw)
return None
if description.value_fn is None:
if self._dev_log:
_LOGGER.debug("native_value has no value_fn: key=%s raw=%s", key, raw)
return None
try:
value = description.value_fn(raw)
except Exception: # noqa: BLE001
_LOGGER.exception(
"native_value compute failed via value_fn for key=%s raw=%s", key, raw
)
return None
return value
@property
def suggested_entity_id(self) -> str:
"""Return name."""
return generate_entity_id("sensor.{}", self.entity_description.key)
@property
@cached_property
def device_info(self) -> DeviceInfo:
"""Device info."""
return DeviceInfo(

View File

@ -11,4 +11,7 @@ from homeassistant.components.sensor import SensorEntityDescription
class WeatherSensorEntityDescription(SensorEntityDescription):
"""Describe Weather Sensor entities."""
value_fn: Callable[[Any], int | float | str | None]
value_fn: Callable[[Any], int | float | str | None] | None = None
value_from_data_fn: Callable[[dict[str, Any]], int | float | str | None] | None = (
None
)

View File

@ -1,7 +1,5 @@
"""Sensor entities for the SWS12500 integration for old endpoint."""
from typing import cast
from homeassistant.components.sensor import SensorDeviceClass, SensorStateClass
from homeassistant.const import (
DEGREE,
@ -41,7 +39,7 @@ from .const import (
UnitOfDir,
)
from .sensors_common import WeatherSensorEntityDescription
from .utils import wind_dir_to_text
from .utils import chill_index, heat_index, to_float, to_int, wind_dir_to_text
SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
WeatherSensorEntityDescription(
@ -51,7 +49,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
icon="mdi:thermometer",
device_class=SensorDeviceClass.TEMPERATURE,
translation_key=INDOOR_TEMP,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=INDOOR_HUMIDITY,
@ -60,7 +58,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
icon="mdi:thermometer",
device_class=SensorDeviceClass.HUMIDITY,
translation_key=INDOOR_HUMIDITY,
value_fn=lambda data: cast("int", data),
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=OUTSIDE_TEMP,
@ -69,7 +67,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
icon="mdi:thermometer",
device_class=SensorDeviceClass.TEMPERATURE,
translation_key=OUTSIDE_TEMP,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=OUTSIDE_HUMIDITY,
@ -78,7 +76,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
icon="mdi:thermometer",
device_class=SensorDeviceClass.HUMIDITY,
translation_key=OUTSIDE_HUMIDITY,
value_fn=lambda data: cast("int", data),
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=DEW_POINT,
@ -87,7 +85,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
icon="mdi:thermometer-lines",
device_class=SensorDeviceClass.TEMPERATURE,
translation_key=DEW_POINT,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=BARO_PRESSURE,
@ -97,7 +95,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
device_class=SensorDeviceClass.ATMOSPHERIC_PRESSURE,
suggested_unit_of_measurement=UnitOfPressure.HPA,
translation_key=BARO_PRESSURE,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=WIND_SPEED,
@ -107,7 +105,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
suggested_unit_of_measurement=UnitOfSpeed.KILOMETERS_PER_HOUR,
icon="mdi:weather-windy",
translation_key=WIND_SPEED,
value_fn=lambda data: cast("int", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=WIND_GUST,
@ -117,23 +115,24 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
suggested_unit_of_measurement=UnitOfSpeed.KILOMETERS_PER_HOUR,
icon="mdi:windsock",
translation_key=WIND_GUST,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=WIND_DIR,
native_unit_of_measurement=DEGREE,
state_class=SensorStateClass.MEASUREMENT,
state_class=SensorStateClass.MEASUREMENT_ANGLE,
device_class=SensorDeviceClass.WIND_DIRECTION,
suggested_display_precision=None,
icon="mdi:sign-direction",
translation_key=WIND_DIR,
value_fn=lambda data: cast("int", data),
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=WIND_AZIMUT,
icon="mdi:sign-direction",
value_fn=lambda data: cast("str", wind_dir_to_text(data)),
value_from_data_fn=lambda data: wind_dir_to_text(data.get(WIND_DIR, 0.0)),
device_class=SensorDeviceClass.ENUM,
options=list(UnitOfDir),
options=[e.value for e in UnitOfDir],
translation_key=WIND_AZIMUT,
),
WeatherSensorEntityDescription(
@ -145,7 +144,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
suggested_display_precision=2,
icon="mdi:weather-pouring",
translation_key=RAIN,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=DAILY_RAIN,
@ -156,7 +155,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
suggested_display_precision=2,
icon="mdi:weather-pouring",
translation_key=DAILY_RAIN,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=SOLAR_RADIATION,
@ -165,7 +164,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
device_class=SensorDeviceClass.IRRADIANCE,
icon="mdi:weather-sunny",
translation_key=SOLAR_RADIATION,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=UV,
@ -174,7 +173,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
native_unit_of_measurement=UV_INDEX,
icon="mdi:sunglasses",
translation_key=UV,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=CH2_TEMP,
@ -184,7 +183,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
suggested_unit_of_measurement=UnitOfTemperature.CELSIUS,
icon="mdi:weather-sunny",
translation_key=CH2_TEMP,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=CH2_HUMIDITY,
@ -193,7 +192,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
device_class=SensorDeviceClass.HUMIDITY,
icon="mdi:weather-sunny",
translation_key=CH2_HUMIDITY,
value_fn=lambda data: cast("int", data),
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=CH3_TEMP,
@ -203,7 +202,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
suggested_unit_of_measurement=UnitOfTemperature.CELSIUS,
icon="mdi:weather-sunny",
translation_key=CH3_TEMP,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=CH3_HUMIDITY,
@ -212,7 +211,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
device_class=SensorDeviceClass.HUMIDITY,
icon="mdi:weather-sunny",
translation_key=CH3_HUMIDITY,
value_fn=lambda data: cast("int", data),
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=CH4_TEMP,
@ -222,7 +221,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
suggested_unit_of_measurement=UnitOfTemperature.CELSIUS,
icon="mdi:weather-sunny",
translation_key=CH4_TEMP,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=CH4_HUMIDITY,
@ -231,7 +230,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
device_class=SensorDeviceClass.HUMIDITY,
icon="mdi:weather-sunny",
translation_key=CH4_HUMIDITY,
value_fn=lambda data: cast("int", data),
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=HEAT_INDEX,
@ -242,7 +241,8 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
suggested_display_precision=2,
icon="mdi:weather-sunny",
translation_key=HEAT_INDEX,
value_fn=lambda data: cast("int", data),
value_fn=to_int,
value_from_data_fn=heat_index,
),
WeatherSensorEntityDescription(
key=CHILL_INDEX,
@ -253,6 +253,7 @@ SENSOR_TYPES_WEATHER_API: tuple[WeatherSensorEntityDescription, ...] = (
suggested_display_precision=2,
icon="mdi:weather-sunny",
translation_key=CHILL_INDEX,
value_fn=lambda data: cast("int", data),
value_fn=to_int,
value_from_data_fn=chill_index,
),
)

View File

@ -1,7 +1,5 @@
"""Sensor entities for the SWS12500 integration for old endpoint."""
from typing import cast
from homeassistant.components.sensor import SensorDeviceClass, SensorStateClass
from homeassistant.const import (
DEGREE,
@ -17,35 +15,54 @@ from homeassistant.const import (
from .const import (
BARO_PRESSURE,
CH2_BATTERY,
CH2_HUMIDITY,
CH2_TEMP,
CH3_BATTERY,
CH3_HUMIDITY,
CH3_TEMP,
CH4_BATTERY,
CH4_HUMIDITY,
CH4_TEMP,
CH5_BATTERY,
CH5_HUMIDITY,
CH5_TEMP,
CH6_BATTERY,
CH6_HUMIDITY,
CH6_TEMP,
CH7_BATTERY,
CH7_HUMIDITY,
CH7_TEMP,
CH8_BATTERY,
CH8_HUMIDITY,
CH8_TEMP,
CHILL_INDEX,
DAILY_RAIN,
DEW_POINT,
HEAT_INDEX,
HOURLY_RAIN,
INDOOR_BATTERY,
INDOOR_HUMIDITY,
INDOOR_TEMP,
MONTHLY_RAIN,
OUTSIDE_BATTERY,
OUTSIDE_HUMIDITY,
OUTSIDE_TEMP,
RAIN,
SOLAR_RADIATION,
UV,
WBGT_TEMP,
WEEKLY_RAIN,
WIND_AZIMUT,
WIND_DIR,
WIND_GUST,
WIND_SPEED,
YEARLY_RAIN,
UnitOfBat,
UnitOfDir,
)
from .sensors_common import WeatherSensorEntityDescription
from .utils import wind_dir_to_text
from .utils import battery_level, to_float, to_int, wind_dir_to_text
SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
WeatherSensorEntityDescription(
@ -55,7 +72,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
icon="mdi:thermometer",
device_class=SensorDeviceClass.TEMPERATURE,
translation_key=INDOOR_TEMP,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=INDOOR_HUMIDITY,
@ -64,7 +81,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
icon="mdi:thermometer",
device_class=SensorDeviceClass.HUMIDITY,
translation_key=INDOOR_HUMIDITY,
value_fn=lambda data: cast("int", data),
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=OUTSIDE_TEMP,
@ -73,7 +90,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
icon="mdi:thermometer",
device_class=SensorDeviceClass.TEMPERATURE,
translation_key=OUTSIDE_TEMP,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=OUTSIDE_HUMIDITY,
@ -82,7 +99,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
icon="mdi:thermometer",
device_class=SensorDeviceClass.HUMIDITY,
translation_key=OUTSIDE_HUMIDITY,
value_fn=lambda data: cast("int", data),
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=DEW_POINT,
@ -91,7 +108,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
icon="mdi:thermometer-lines",
device_class=SensorDeviceClass.TEMPERATURE,
translation_key=DEW_POINT,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=BARO_PRESSURE,
@ -101,7 +118,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
device_class=SensorDeviceClass.ATMOSPHERIC_PRESSURE,
suggested_unit_of_measurement=UnitOfPressure.HPA,
translation_key=BARO_PRESSURE,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=WIND_SPEED,
@ -111,7 +128,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
suggested_unit_of_measurement=UnitOfSpeed.KILOMETERS_PER_HOUR,
icon="mdi:weather-windy",
translation_key=WIND_SPEED,
value_fn=lambda data: cast("int", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=WIND_GUST,
@ -121,35 +138,36 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
suggested_unit_of_measurement=UnitOfSpeed.KILOMETERS_PER_HOUR,
icon="mdi:windsock",
translation_key=WIND_GUST,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=WIND_DIR,
native_unit_of_measurement=DEGREE,
state_class=SensorStateClass.MEASUREMENT,
state_class=SensorStateClass.MEASUREMENT_ANGLE,
device_class=SensorDeviceClass.WIND_DIRECTION,
suggested_display_precision=None,
icon="mdi:sign-direction",
translation_key=WIND_DIR,
value_fn=lambda data: cast("int", data),
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=WIND_AZIMUT,
icon="mdi:sign-direction",
value_fn=lambda data: cast("str", wind_dir_to_text(data)),
value_from_data_fn=lambda dir: wind_dir_to_text(dir.get(WIND_DIR, 0.0)),
device_class=SensorDeviceClass.ENUM,
options=list(UnitOfDir),
options=[e.value for e in UnitOfDir],
translation_key=WIND_AZIMUT,
),
WeatherSensorEntityDescription(
key=RAIN,
native_unit_of_measurement=UnitOfVolumetricFlux.MILLIMETERS_PER_HOUR,
device_class=SensorDeviceClass.PRECIPITATION,
state_class=SensorStateClass.TOTAL,
device_class=SensorDeviceClass.PRECIPITATION_INTENSITY,
state_class=SensorStateClass.MEASUREMENT,
suggested_unit_of_measurement=UnitOfVolumetricFlux.MILLIMETERS_PER_HOUR,
suggested_display_precision=2,
icon="mdi:weather-pouring",
translation_key=RAIN,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=DAILY_RAIN,
@ -160,7 +178,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
suggested_display_precision=2,
icon="mdi:weather-pouring",
translation_key=DAILY_RAIN,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=HOURLY_RAIN,
@ -171,7 +189,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
suggested_display_precision=2,
icon="mdi:weather-pouring",
translation_key=HOURLY_RAIN,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=WEEKLY_RAIN,
@ -182,7 +200,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
suggested_display_precision=2,
icon="mdi:weather-pouring",
translation_key=WEEKLY_RAIN,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=MONTHLY_RAIN,
@ -193,7 +211,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
suggested_display_precision=2,
icon="mdi:weather-pouring",
translation_key=MONTHLY_RAIN,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=YEARLY_RAIN,
@ -204,7 +222,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
suggested_display_precision=2,
icon="mdi:weather-pouring",
translation_key=YEARLY_RAIN,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=SOLAR_RADIATION,
@ -213,7 +231,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
device_class=SensorDeviceClass.IRRADIANCE,
icon="mdi:weather-sunny",
translation_key=SOLAR_RADIATION,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=UV,
@ -222,7 +240,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
native_unit_of_measurement=UV_INDEX,
icon="mdi:sunglasses",
translation_key=UV,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=CH2_TEMP,
@ -232,7 +250,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
suggested_unit_of_measurement=UnitOfTemperature.CELSIUS,
icon="mdi:weather-sunny",
translation_key=CH2_TEMP,
value_fn=lambda data: cast("float", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=CH2_HUMIDITY,
@ -241,46 +259,164 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
device_class=SensorDeviceClass.HUMIDITY,
icon="mdi:weather-sunny",
translation_key=CH2_HUMIDITY,
value_fn=lambda data: cast("int", data),
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=CH3_TEMP,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.TEMPERATURE,
suggested_unit_of_measurement=UnitOfTemperature.CELSIUS,
icon="mdi:weather-sunny",
translation_key=CH3_TEMP,
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=CH3_HUMIDITY,
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.HUMIDITY,
icon="mdi:weather-sunny",
translation_key=CH3_HUMIDITY,
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=CH4_TEMP,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.TEMPERATURE,
suggested_unit_of_measurement=UnitOfTemperature.CELSIUS,
icon="mdi:weather-sunny",
translation_key=CH4_TEMP,
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=CH4_HUMIDITY,
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.HUMIDITY,
icon="mdi:weather-sunny",
translation_key=CH4_HUMIDITY,
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=CH5_TEMP,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.TEMPERATURE,
suggested_unit_of_measurement=UnitOfTemperature.CELSIUS,
icon="mdi:weather-sunny",
translation_key=CH5_TEMP,
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=CH5_HUMIDITY,
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.HUMIDITY,
icon="mdi:weather-sunny",
translation_key=CH5_HUMIDITY,
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=CH6_TEMP,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.TEMPERATURE,
suggested_unit_of_measurement=UnitOfTemperature.CELSIUS,
icon="mdi:weather-sunny",
translation_key=CH6_TEMP,
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=CH6_HUMIDITY,
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.HUMIDITY,
icon="mdi:weather-sunny",
translation_key=CH6_HUMIDITY,
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=CH7_TEMP,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.TEMPERATURE,
suggested_unit_of_measurement=UnitOfTemperature.CELSIUS,
icon="mdi:weather-sunny",
translation_key=CH7_TEMP,
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=CH7_HUMIDITY,
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.HUMIDITY,
icon="mdi:weather-sunny",
translation_key=CH7_HUMIDITY,
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=CH8_TEMP,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.TEMPERATURE,
suggested_unit_of_measurement=UnitOfTemperature.CELSIUS,
icon="mdi:weather-sunny",
translation_key=CH8_TEMP,
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=CH8_HUMIDITY,
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.HUMIDITY,
icon="mdi:weather-sunny",
translation_key=CH8_HUMIDITY,
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=CH3_BATTERY,
translation_key=CH3_BATTERY,
icon="mdi:battery-unknown",
device_class=SensorDeviceClass.ENUM,
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=CH4_BATTERY,
translation_key=CH4_BATTERY,
icon="mdi:battery-unknown",
device_class=SensorDeviceClass.ENUM,
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=CH5_BATTERY,
translation_key=CH5_BATTERY,
icon="mdi:battery-unknown",
device_class=SensorDeviceClass.ENUM,
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=CH6_BATTERY,
translation_key=CH6_BATTERY,
icon="mdi:battery-unknown",
device_class=SensorDeviceClass.ENUM,
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=CH7_BATTERY,
translation_key=CH7_BATTERY,
icon="mdi:battery-unknown",
device_class=SensorDeviceClass.ENUM,
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=CH8_BATTERY,
translation_key=CH8_BATTERY,
icon="mdi:battery-unknown",
device_class=SensorDeviceClass.ENUM,
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=HEAT_INDEX,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
@ -290,7 +426,7 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
suggested_display_precision=2,
icon="mdi:weather-sunny",
translation_key=HEAT_INDEX,
value_fn=lambda data: cast("int", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=CHILL_INDEX,
@ -301,6 +437,42 @@ SENSOR_TYPES_WSLINK: tuple[WeatherSensorEntityDescription, ...] = (
suggested_display_precision=2,
icon="mdi:weather-sunny",
translation_key=CHILL_INDEX,
value_fn=lambda data: cast("int", data),
value_fn=to_float,
),
WeatherSensorEntityDescription(
key=OUTSIDE_BATTERY,
translation_key=OUTSIDE_BATTERY,
device_class=SensorDeviceClass.ENUM,
options=[e.value for e in UnitOfBat],
value_fn=None,
value_from_data_fn=lambda data: (
battery_level(data.get(OUTSIDE_BATTERY, None)).value
),
),
WeatherSensorEntityDescription(
key=CH2_BATTERY,
translation_key=CH2_BATTERY,
device_class=SensorDeviceClass.ENUM,
options=[e.value for e in UnitOfBat],
value_fn=None,
value_from_data_fn=lambda data: (
battery_level(data.get(CH2_BATTERY, None)).value
),
),
WeatherSensorEntityDescription(
key=INDOOR_BATTERY,
translation_key=INDOOR_BATTERY,
device_class=SensorDeviceClass.BATTERY,
value_fn=to_int,
),
WeatherSensorEntityDescription(
key=WBGT_TEMP,
translation_key=WBGT_TEMP,
icon="mdi:thermometer",
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.TEMPERATURE,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
suggested_display_precision=2,
value_fn=to_float,
),
)
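The battery descriptions above map a raw payload value to a `UnitOfBat` member through `battery_level`, neither of which is shown in this diff. A plausible sketch under stated assumptions — the enum values mirror the translation states (`normal`/`low`/`unknown`) added in this changeset, while the flag mapping (0 = OK, 1 = low, as many stations report) is purely illustrative:

```python
# Hypothetical sketch of UnitOfBat / battery_level from .utils and .const.
# Enum values match the translation states in this diff; the 0/1 flag
# interpretation is an assumption for illustration only.
from enum import Enum


class UnitOfBat(Enum):
    """Battery states matching the translation keys in the diff."""

    NORMAL = "normal"
    LOW = "low"
    UNKNOWN = "unknown"


def battery_level(value) -> UnitOfBat:
    """Map a raw battery payload to a UnitOfBat member."""
    if value is None or value == "":
        return UnitOfBat.UNKNOWN
    try:
        flag = int(float(value))
    except (TypeError, ValueError):
        return UnitOfBat.UNKNOWN
    return UnitOfBat.LOW if flag else UnitOfBat.NORMAL
```

Returning an enum member lets the `value_from_data_fn` lambdas above emit `.value`, which Home Assistant then resolves against the per-state translations.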

View File

@ -5,7 +5,6 @@
"valid_credentials_key": "Provide valid API KEY.",
"valid_credentials_match": "API ID and API KEY should not be the same."
},
"step": {
"user": {
"description": "Provide API ID and API KEY so the Weather Station can access HomeAssistant",
@ -25,7 +24,6 @@
}
}
},
"options": {
"error": {
"valid_credentials_api": "Provide valid API ID.",
@ -33,7 +31,6 @@
"valid_credentials_match": "API ID and API KEY should not be the same.",
"windy_key_required": "Windy API key is required if you want to enable this function."
},
"step": {
"init": {
"title": "Configure SWS12500 Integration",
@ -43,7 +40,6 @@
"windy": "Windy configuration"
}
},
"basic": {
"description": "Provide API ID and API KEY so the Weather Station can access HomeAssistant",
"title": "Configure credentials",
@ -60,7 +56,6 @@
"WSLINK": "Enable WSLink API if the station is set to send data via WSLink."
}
},
"windy": {
"description": "Resend weather data to your Windy stations.",
"title": "Configure Windy",
@ -74,6 +69,36 @@
"windy_logger_checkbox": "Enable only if you want to send debugging data to the developer."
}
},
"pocasi": {
"description": "Resend data to Pocasi Meteo CZ",
"title": "Configure Pocasi Meteo CZ",
"data": {
"POCASI_CZ_API_ID": "ID from your Pocasi Meteo APP",
"POCASI_CZ_API_KEY": "Key from your Pocasi Meteo APP",
"POCASI_CZ_SEND_INTERVAL": "Resend interval in seconds",
"pocasi_enabled_checkbox": "Enable resending data to Pocasi Meteo",
"pocasi_logger_checkbox": "Log data and responses"
},
"data_description": {
"POCASI_CZ_API_ID": "You can obtain your ID in Pocasi Meteo App",
"POCASI_CZ_API_KEY": "You can obtain your KEY in Pocasi Meteo App",
"POCASI_CZ_SEND_INTERVAL": "Resend interval in seconds (minimum 12s, default 30s)",
"pocasi_enabled_checkbox": "Enables resending data to Pocasi Meteo",
"pocasi_logger_checkbox": "Enable only if you want to send debug data to the developer"
}
},
"ecowitt": {
"description": "Ecowitt settings",
"title": "Configuration for Ecowitt stations",
"data": {
"ecowitt_webhook_id": "Unique webhook ID",
"ecowitt_enabled": "Enable data from the Ecowitt station"
},
"data_description": {
"ecowitt_webhook_id": "Station setting: {url}:{port}/weatherhub/{webhook_id}",
"ecowitt_enabled": "Enable receiving data from Ecowitt stations"
}
},
"migration": {
"title": "Statistic migration.",
"description": "For the correct functioning of long-term statistics, it is necessary to migrate the sensor unit in the long-term statistics. The original unit of long-term statistics for daily precipitation was in mm/d, however, the station only sends data in mm without time differentiation.\n\n The sensor to be migrated is for daily precipitation. If the correct value is already in the list for the daily precipitation sensor (mm), then the migration is already complete.\n\n Migration result for the sensor: {migration_status}, a total of {migration_count} rows converted.",
@ -88,30 +113,86 @@
}
}
},
"entity": {
"sensor": {
"indoor_temp": { "name": "Indoor temperature" },
"indoor_humidity": { "name": "Indoor humidity" },
"outside_temp": { "name": "Outside Temperature" },
"outside_humidity": { "name": "Outside humidity" },
"uv": { "name": "UV index" },
"baro_pressure": { "name": "Barometric pressure" },
"dew_point": { "name": "Dew point" },
"wind_speed": { "name": "Wind speed" },
"wind_dir": { "name": "Wind direction" },
"wind_gust": { "name": "Wind gust" },
"rain": { "name": "Rain" },
"daily_rain": { "name": "Daily precipitation" },
"solar_radiation": { "name": "Solar irradiance" },
"ch2_temp": { "name": "Channel 2 temperature" },
"ch2_humidity": { "name": "Channel 2 humidity" },
"ch3_temp": { "name": "Channel 3 temperature" },
"ch3_humidity": { "name": "Channel 3 humidity" },
"ch4_temp": { "name": "Channel 4 temperature" },
"ch4_humidity": { "name": "Channel 4 humidity" },
"heat_index": { "name": "Apparent temperature" },
"chill_index": { "name": "Wind chill" },
"indoor_temp": {
"name": "Indoor temperature"
},
"indoor_humidity": {
"name": "Indoor humidity"
},
"outside_temp": {
"name": "Outside Temperature"
},
"outside_humidity": {
"name": "Outside humidity"
},
"uv": {
"name": "UV index"
},
"baro_pressure": {
"name": "Barometric pressure"
},
"dew_point": {
"name": "Dew point"
},
"wind_speed": {
"name": "Wind speed"
},
"wind_dir": {
"name": "Wind direction"
},
"wind_gust": {
"name": "Wind gust"
},
"rain": {
"name": "Rain"
},
"daily_rain": {
"name": "Daily precipitation"
},
"solar_radiation": {
"name": "Solar irradiance"
},
"ch2_temp": {
"name": "Channel 2 temperature"
},
"ch2_humidity": {
"name": "Channel 2 humidity"
},
"ch3_temp": {
"name": "Channel 3 temperature"
},
"ch3_humidity": {
"name": "Channel 3 humidity"
},
"ch4_temp": {
"name": "Channel 4 temperature"
},
"ch4_humidity": {
"name": "Channel 4 humidity"
},
"heat_index": {
"name": "Apparent temperature"
},
"chill_index": {
"name": "Wind chill"
},
"hourly_rain": {
"name": "Hourly precipitation"
},
"weekly_rain": {
"name": "Weekly precipitation"
},
"monthly_rain": {
"name": "Monthly precipitation"
},
"yearly_rain": {
"name": "Yearly precipitation"
},
"wbgt_temp": {
"name": "WBGT index"
},
"wind_azimut": {
"name": "Bearing",
"state": {
@ -132,6 +213,30 @@
"nw": "NW",
"nnw": "NNW"
}
},
"outside_battery": {
"name": "Outside battery level",
"state": {
"normal": "OK",
"low": "Low",
"unknown": "Unknown / drained out"
}
},
"ch2_battery": {
"name": "Channel 2 battery level",
"state": {
"normal": "OK",
"low": "Low",
"unknown": "Unknown / drained out"
}
},
"indoor_battery": {
"name": "Console battery level",
"state": {
"normal": "OK",
"low": "Low",
"unknown": "Unknown / drained out"
}
}
}
},

View File

@ -0,0 +1 @@
../dev/custom_components/sws12500

View File

@ -24,15 +24,17 @@
}
}
},
"options": {
"error": {
"valid_credentials_api": "Vyplňte platné API ID",
"valid_credentials_key": "Vyplňte platný API KEY",
"valid_credentials_match": "API ID a API KEY nesmějí být stejné!",
"windy_key_required": "Je vyžadován Windy API key, pokud chcete aktivovat přeposílání dat na Windy"
"windy_id_required": "Je vyžadováno Windy ID, pokud chcete aktivovat přeposílání dat na Windy",
"windy_pw_required": "Je vyžadován Windy KEY, pokud chcete aktivovat přeposílání dat na Windy",
"pocasi_id_required": "Je vyžadováno Počasí ID, pokud chcete aktivovat přeposílání dat na Počasí Meteo CZ",
"pocasi_key_required": "Klíč k účtu Počasí Meteo je povinný.",
"pocasi_send_minimum": "Minimální interval pro přeposílání je 12 sekund."
},
"step": {
"init": {
"title": "Nastavení integrace SWS12500",
@ -40,10 +42,12 @@
"menu_options": {
"basic": "Základní - přístupové údaje (přihlášení)",
"windy": "Nastavení pro přeposílání dat na Windy",
"pocasi": "Nastavení pro přeposlání dat na Počasí Meteo CZ",
"ecowitt": "Nastavení pro stanice Ecowitt",
"wslink_port_setup": "Nastavení portu WSLink Addonu",
"migration": "Migrace statistiky senzoru"
}
},
"basic": {
"description": "Zadejte API ID a API KEY, aby meteostanice mohla komunikovat s HomeAssistantem",
"title": "Nastavení přihlášení",
@ -60,20 +64,61 @@
"wslink": "WSLink API zapněte, pokud je stanice nastavena na zasílání dat přes WSLink."
}
},
"windy": {
"description": "Přeposílání dat z metostanice na Windy",
"title": "Konfigurace Windy",
"data": {
"WINDY_API_KEY": "Klíč API KEY získaný z Windy",
"WINDY_STATION_ID": "ID stanice, získaný z Windy",
"WINDY_STATION_PWD": "Heslo stanice, získané z Windy",
"windy_enabled_checkbox": "Povolit přeposílání dat na Windy",
"windy_logger_checkbox": "Logovat data a odpovědi z Windy"
},
"data_description": {
"WINDY_API_KEY": "Klíč API KEY získaný z https://api.windy.com/keys",
"WINDY_STATION_ID": "ID stanice získané z https://stations.windy.com/station",
"WINDY_STATION_PWD": "Heslo stanice získané z https://stations.windy.com/station",
"windy_logger_checkbox": "Zapnout pouze v případě, že chcete poslat ladící informace vývojáři."
}
},
"pocasi": {
"description": "Přeposílání dat do aplikace Počasí Meteo",
"title": "Konfigurace Počasí Meteo",
"data": {
"POCASI_CZ_API_ID": "ID účtu na Počasí Meteo",
"POCASI_CZ_API_KEY": "Klíč (Key) k účtu Počasí Meteo",
"POCASI_CZ_SEND_INTERVAL": "Interval v sekundách",
"pocasi_enabled_checkbox": "Povolit přeposílání dat na server Počasí Meteo",
"pocasi_logger_checkbox": "Logovat data a odpovědi z Počasí Meteo"
},
"data_description": {
"POCASI_CZ_API_ID": "ID získáte ve své aplikaci Počasí Meteo",
"POCASI_CZ_API_KEY": "Klíč (Key) získáte ve své aplikaci Počasí Meteo",
"POCASI_CZ_SEND_INTERVAL": "Interval v jakém se mají data na server přeposílat (minimum 12s, defaultně 30s)",
"pocasi_enabled_checkbox": "Zapne přeposílání dat na server Počasí Meteo",
"pocasi_logger_checkbox": "Zapnout pouze v případě, že chcete zaslat ladící informace vývojáři."
}
},
"ecowitt": {
"description": "Nastavení pro Ecowitt",
"title": "Konfigurace pro stanice Ecowitt",
"data": {
"ecowitt_webhook_id": "Unikátní webhook ID",
"ecowitt_enabled": "Povolit data ze stanice Ecowitt"
},
"data_description": {
"ecowitt_webhook_id": "Nastavení pro stanici: {url}:{port}/weatherhub/{webhook_id}",
"ecowitt_enabled": "Povolit přijímání dat ze stanic Ecowitt"
}
},
"wslink_port_setup": {
"description": "Nastavení portu, kde naslouchá WSLink Addon. Slouží pro příjem diagnostik.",
"title": "Port WSLink Addonu",
"data": {
"WSLINK_ADDON_PORT": "Naslouchající port WSLink Addonu"
},
"data_description": {
"WSLINK_ADDON_PORT": "Zadejte port, tak jak jej máte nastavený ve WSLink Addonu."
}
},
"migration": {
"title": "Migrace statistiky senzoru.",
"description": "Pro správnou funkci dlouhodobé statistiky je nutné provést migraci jednotky senzoru v dlouhodobé statistice. Původní jednotka dlouhodobé statistiky pro denní úhrn srážek byla v mm/d, nicméně stanice zasílá pouze data v mm bez časového rozlišení.\n\n Senzor, který má být migrován je pro denní úhrn srážek. Pokud je v seznamu již správná hodnota u senzoru pro denní úhrn (mm), pak je již migrace hotová.\n\n Výsledek migrace pro senzor: {migration_status}, převedeno celkem {migration_count} řádků.",
@ -88,34 +133,181 @@
}
}
},
"entity": {
"sensor": {
"indoor_temp": { "name": "Vnitřní teplota" },
"indoor_humidity": { "name": "Vnitřní vlhkost vzduchu" },
"outside_temp": { "name": "Venkovní teplota" },
"outside_humidity": { "name": "Venkovní vlhkost vzduchu" },
"uv": { "name": "UV index" },
"baro_pressure": { "name": "Tlak vzduchu" },
"dew_point": { "name": "Rosný bod" },
"wind_speed": { "name": "Rychlost větru" },
"wind_dir": { "name": "Směr větru" },
"wind_gust": { "name": "Poryvy větru" },
"rain": { "name": "Srážky" },
"daily_rain": { "name": "Denní úhrn srážek" },
"solar_radiation": { "name": "Sluneční osvit" },
"ch2_temp": { "name": "Teplota senzoru 2" },
"ch2_humidity": { "name": "Vlhkost sensoru 2" },
"ch3_temp": { "name": "Teplota senzoru 3" },
"ch3_humidity": { "name": "Vlhkost sensoru 3" },
"ch4_temp": { "name": "Teplota senzoru 4" },
"ch4_humidity": { "name": "Vlhkost sensoru 4" },
"heat_index": { "name": "Tepelný index" },
"chill_index": { "name": "Pocitová teplota" },
"hourly_rain": { "name": "Hodinový úhrn srážek" },
"weekly_rain": { "name": "Týdenní úhrn srážek" },
"monthly_rain": { "name": "Měsíční úhrn srážek" },
"yearly_rain": { "name": "Roční úhrn srážek" },
"integration_health": {
"name": "Stav integrace",
"state": {
"online_wu": "Online PWS/WU",
"online_wslink": "Online WSLink",
"online_idle": "Čekám na data",
"degraded": "Degradovaný",
"error": "Nefunkční"
}
},
"active_protocol": {
"name": "Aktivní protokol",
"state": {
"wu": "PWS/WU",
"wslink": "WSLink API"
}
},
"wslink_addon_status": {
"name": "Stav WSLink Addonu",
"state": {
"online": "Běží",
"offline": "Vypnutý"
}
},
"wslink_addon_name": {
"name": "Název WSLink Addonu"
},
"wslink_addon_version": {
"name": "Verze WSLink Addonu"
},
"wslink_addon_listen_port": {
"name": "Port WSLink Addonu"
},
"wslink_upstream_ha_port": {
"name": "Port upstream HA WSLink Addonu"
},
"route_wu_enabled": {
"name": "Protokol PWS/WU"
},
"route_wslink_enabled": {
"name": "Protokol WSLink"
},
"last_ingress_time": {
"name": "Poslední přístup"
},
"last_ingress_protocol": {
"name": "Protokol posledního přístupu",
"state": {
"wu": "PWS/WU",
"wslink": "WSLink API"
}
},
"last_ingress_route_enabled": {
"name": "Trasa posledního přístupu povolena"
},
"last_ingress_accepted": {
"name": "Poslední přístup",
"state": {
"accepted": "Přijat",
"rejected": "Odmítnut"
}
},
"last_ingress_authorized": {
"name": "Autorizace posledního přístupu",
"state": {
"authorized": "Autorizován",
"unauthorized": "Neautorizován",
"unknown": "Neznámý"
}
},
"last_ingress_reason": {
"name": "Zpráva přístupu"
},
"forward_windy_enabled": {
"name": "Přeposílání na Windy"
},
"forward_windy_status": {
"name": "Stav přeposílání na Windy",
"state": {
"disabled": "Vypnuto",
"idle": "Čekám na odeslání",
"ok": "Ok"
}
},
"forward_pocasi_enabled": {
"name": "Přeposílání na Počasí Meteo"
},
"forward_pocasi_status": {
"name": "Stav přeposílání na Počasí Meteo",
"state": {
"disabled": "Vypnuto",
"idle": "Čekám na odeslání",
"ok": "Ok"
}
},
"indoor_temp": {
"name": "Vnitřní teplota"
},
"indoor_humidity": {
"name": "Vnitřní vlhkost vzduchu"
},
"outside_temp": {
"name": "Venkovní teplota"
},
"outside_humidity": {
"name": "Venkovní vlhkost vzduchu"
},
"uv": {
"name": "UV index"
},
"baro_pressure": {
"name": "Tlak vzduchu"
},
"dew_point": {
"name": "Rosný bod"
},
"wind_speed": {
"name": "Rychlost větru"
},
"wind_dir": {
"name": "Směr větru"
},
"wind_gust": {
"name": "Poryvy větru"
},
"rain": {
"name": "Srážky"
},
"daily_rain": {
"name": "Denní úhrn srážek"
},
"solar_radiation": {
"name": "Sluneční osvit"
},
"ch2_temp": {
"name": "Teplota senzoru 2"
},
"ch2_humidity": {
"name": "Vlhkost sensoru 2"
},
"ch3_temp": {
"name": "Teplota senzoru 3"
},
"ch3_humidity": {
"name": "Vlhkost sensoru 3"
},
"ch4_temp": {
"name": "Teplota senzoru 4"
},
"ch4_humidity": {
"name": "Vlhkost sensoru 4"
},
"heat_index": {
"name": "Tepelný index"
},
"chill_index": {
"name": "Pocitová teplota"
},
"hourly_rain": {
"name": "Hodinový úhrn srážek"
},
"weekly_rain": {
"name": "Týdenní úhrn srážek"
},
"monthly_rain": {
"name": "Měsíční úhrn srážek"
},
"yearly_rain": {
"name": "Roční úhrn srážek"
},
"wbgt_temp": {
"name": "WBGT index"
},
"wind_azimut": {
"name": "Azimut",
"state": {
@ -136,6 +328,30 @@
"nw": "SZ",
"nnw": "SSZ"
}
},
"outside_battery": {
"name": "Stav nabití venkovní baterie",
"state": {
"low": "Nízká",
"normal": "Normální",
"unknown": "Neznámá / zcela vybitá"
}
},
"indoor_battery": {
"name": "Stav nabití baterie konzole",
"state": {
"low": "Nízká",
"normal": "Normální",
"unknown": "Neznámá / zcela vybitá"
}
},
"ch2_battery": {
"name": "Stav nabití baterie kanálu 2",
"state": {
"low": "Nízká",
"normal": "Normální",
"unknown": "Neznámá / zcela vybitá"
}
}
}
},

View File

@ -5,7 +5,6 @@
"valid_credentials_key": "Provide valid API KEY.",
"valid_credentials_match": "API ID and API KEY should not be the same."
},
"step": {
"user": {
"description": "Provide API ID and API KEY so the Weather Station can access HomeAssistant",
@ -25,15 +24,14 @@
}
}
},
"options": {
"error": {
"valid_credentials_api": "Provide valid API ID.",
"valid_credentials_key": "Provide valid API KEY.",
"valid_credentials_match": "API ID and API KEY should not be the same.",
"windy_key_required": "Windy API key is required if you want to enable this function."
"windy_id_required": "Windy API ID is required if you want to enable this function.",
"windy_pw_required": "Windy API password is required if you want to enable this function."
},
"step": {
"init": {
"title": "Configure SWS12500 Integration",
@ -43,7 +41,6 @@
"windy": "Windy configuration"
}
},
"basic": {
"description": "Provide API ID and API KEY so the Weather Station can access HomeAssistant",
"title": "Configure credentials",
@ -60,20 +57,51 @@
"WSLINK": "Enable WSLink API if the station is set to send data via WSLink."
}
},
"windy": {
"description": "Resend weather data to your Windy stations.",
"title": "Configure Windy",
"data": {
"WINDY_API_KEY": "API KEY provided by Windy",
"WINDY_STATION_ID": "Station ID obtained from Windy",
"WINDY_STATION_PWD": "Station password obtained from Windy",
"windy_enabled_checkbox": "Enable resending data to Windy",
"windy_logger_checkbox": "Log Windy data and responses"
},
"data_description": {
"WINDY_API_KEY": "Windy API KEY obtained from https://api.windy.com/keys",
"WINDY_STATION_ID": "Windy station ID obtained from https://stations.windy.com/stations",
"WINDY_STATION_PWD": "Windy station password obtained from https://stations.windy.com/stations",
"windy_logger_checkbox": "Enable only if you want to send debugging data to the developer."
}
},
"pocasi": {
"description": "Resend data to Pocasi Meteo CZ",
"title": "Configure Pocasi Meteo CZ",
"data": {
"POCASI_CZ_API_ID": "ID from your Pocasi Meteo APP",
"POCASI_CZ_API_KEY": "Key from your Pocasi Meteo APP",
"POCASI_CZ_SEND_INTERVAL": "Resend interval in seconds",
"pocasi_enabled_checkbox": "Enable resending data to Pocasi Meteo",
"pocasi_logger_checkbox": "Log data and responses"
},
"data_description": {
"POCASI_CZ_API_ID": "You can obtain your ID in Pocasi Meteo App",
"POCASI_CZ_API_KEY": "You can obtain your KEY in Pocasi Meteo App",
"POCASI_CZ_SEND_INTERVAL": "Resend interval in seconds (minimum 12s, default 30s)",
"pocasi_enabled_checkbox": "Enables resending data to Pocasi Meteo",
"pocasi_logger_checkbox": "Enable only if you want to send debbug data to the developer"
}
},
"ecowitt": {
"description": "Nastavení pro Ecowitt",
"title": "Konfigurace pro stanice Ecowitt",
"data": {
"ecowitt_webhook_id": "Unikátní webhook ID",
"ecowitt_enabled": "Povolit data ze stanice Ecowitt"
},
"data_description": {
"ecowitt_webhook_id": "Nastavení pro stanici: {url}:{port}/weatherhub/{webhook_id}",
"ecowitt_enabled": "Povolit přijímání dat ze stanic Ecowitt"
}
},
"migration": {
"title": "Statistic migration.",
"description": "For the correct functioning of long-term statistics, it is necessary to migrate the sensor unit in the long-term statistics. The original unit of long-term statistics for daily precipitation was in mm/d, however, the station only sends data in mm without time differentiation.\n\n The sensor to be migrated is for daily precipitation. If the correct value is already in the list for the daily precipitation sensor (mm), then the migration is already complete.\n\n Migration result for the sensor: {migration_status}, a total of {migration_count} rows converted.",
@ -88,34 +116,205 @@
}
}
},
"entity": {
"sensor": {
"indoor_temp": { "name": "Indoor temperature" },
"indoor_humidity": { "name": "Indoor humidity" },
"outside_temp": { "name": "Outside Temperature" },
"outside_humidity": { "name": "Outside humidity" },
"uv": { "name": "UV index" },
"baro_pressure": { "name": "Barometric pressure" },
"dew_point": { "name": "Dew point" },
"wind_speed": { "name": "Wind speed" },
"wind_dir": { "name": "Wind direction" },
"wind_gust": { "name": "Wind gust" },
"rain": { "name": "Rain" },
"daily_rain": { "name": "Daily precipitation" },
"solar_radiation": { "name": "Solar irradiance" },
"ch2_temp": { "name": "Channel 2 temperature" },
"ch2_humidity": { "name": "Channel 2 humidity" },
"ch3_temp": { "name": "Channel 3 temperature" },
"ch3_humidity": { "name": "Channel 3 humidity" },
"ch4_temp": { "name": "Channel 4 temperature" },
"ch4_humidity": { "name": "Channel 4 humidity" },
"heat_index": { "name": "Apparent temperature" },
"chill_index": { "name": "Wind chill" },
"hourly_rain": { "name": "Hourly precipitation" },
"weekly_rain": { "name": "Weekly precipitation" },
"monthly_rain": { "name": "Monthly precipitation" },
"yearly_rain": { "name": "Yearly precipitation" },
"integration_health": {
"name": "Integration status",
"state": {
"online_wu": "Online PWS/WU",
"online_wslink": "Online WSLink",
"online_idle": "Waiting for data",
"degraded": "Degraded",
"error": "Error"
}
},
"active_protocol": {
"name": "Active protocol",
"state": {
"wu": "PWS/WU",
"wslink": "WSLink API"
}
},
"wslink_addon_status": {
"name": "WSLink Addon Status",
"state": {
"online": "Running",
"offline": "Offline"
}
},
"wslink_addon_name": {
"name": "WSLink Addon Name"
},
"wslink_addon_version": {
"name": "WSLink Addon Version"
},
"wslink_addon_listen_port": {
"name": "WSLink Addon Listen Port"
},
"wslink_upstream_ha_port": {
"name": "WSLink Addon Upstream HA Port"
},
"route_wu_enabled": {
"name": "PWS/WU Protocol"
},
"route_wslink_enabled": {
"name": "WSLink Protocol"
},
"last_ingress_time": {
"name": "Last access time"
},
"last_ingress_protocol": {
"name": "Last access protocol",
"state": {
"wu": "PWS/WU",
"wslink": "WSLink API"
}
},
"last_ingress_route_enabled": {
"name": "Last ingress route enabled"
},
"last_ingress_accepted": {
"name": "Last access",
"state": {
"accepted": "Accepted",
"rejected": "Rejected"
}
},
"last_ingress_authorized": {
"name": "Last access authorization",
"state": {
"authorized": "Authorized",
"unauthorized": "Unauthorized",
"unknown": "Unknown"
}
},
"last_ingress_reason": {
"name": "Last access reason"
},
"forward_windy_enabled": {
"name": "Forwarding to Windy"
},
"forward_windy_status": {
"name": "Forwarding status to Windy",
"state": {
"disabled": "Disabled",
"idle": "Waiting to send",
"ok": "Ok"
}
},
"forward_pocasi_enabled": {
"name": "Forwarding to Počasí Meteo"
},
"forward_pocasi_status": {
"name": "Forwarding status to Počasí Meteo",
"state": {
"disabled": "Disabled",
"idle": "Waiting to send",
"ok": "Ok"
}
},
"indoor_temp": {
"name": "Indoor temperature"
},
"indoor_humidity": {
"name": "Indoor humidity"
},
"outside_temp": {
"name": "Outside Temperature"
},
"outside_humidity": {
"name": "Outside humidity"
},
"uv": {
"name": "UV index"
},
"baro_pressure": {
"name": "Barometric pressure"
},
"dew_point": {
"name": "Dew point"
},
"wind_speed": {
"name": "Wind speed"
},
"wind_dir": {
"name": "Wind direction"
},
"wind_gust": {
"name": "Wind gust"
},
"rain": {
"name": "Rain"
},
"daily_rain": {
"name": "Daily precipitation"
},
"solar_radiation": {
"name": "Solar irradiance"
},
"ch2_temp": {
"name": "Channel 2 temperature"
},
"ch2_humidity": {
"name": "Channel 2 humidity"
},
"ch3_temp": {
"name": "Channel 3 temperature"
},
"ch3_humidity": {
"name": "Channel 3 humidity"
},
"ch4_temp": {
"name": "Channel 4 temperature"
},
"ch4_humidity": {
"name": "Channel 4 humidity"
},
"ch5_temp": {
"name": "Channel 5 temperature"
},
"ch5_humidity": {
"name": "Channel 5 humidity"
},
"ch6_temp": {
"name": "Channel 6 temperature"
},
"ch6_humidity": {
"name": "Channel 6 humidity"
},
"ch7_temp": {
"name": "Channel 7 temperature"
},
"ch7_humidity": {
"name": "Channel 7 humidity"
},
"ch8_temp": {
"name": "Channel 8 temperature"
},
"ch8_humidity": {
"name": "Channel 8 humidity"
},
"heat_index": {
"name": "Apparent temperature"
},
"chill_index": {
"name": "Wind chill"
},
"hourly_rain": {
"name": "Hourly precipitation"
},
"weekly_rain": {
"name": "Weekly precipitation"
},
"monthly_rain": {
"name": "Monthly precipitation"
},
"yearly_rain": {
"name": "Yearly precipitation"
},
"wbgt_index": {
"name": "WBGT index"
},
"wind_azimut": {
"name": "Bearing",
"state": {
@ -136,6 +335,78 @@
"nw": "NW",
"nnw": "NNW"
}
},
"outside_battery": {
"name": "Outside battery level",
"state": {
"normal": "OK",
"low": "Low",
"unknown": "Unknown / drained out"
}
},
"ch2_battery": {
"name": "Channel 2 battery level",
"state": {
"normal": "OK",
"low": "Low",
"unknown": "Unknown / drained out"
}
},
"ch3_battery": {
"name": "Channel 3 battery level",
"state": {
"normal": "OK",
"low": "Low",
"unknown": "Unknown / drained out"
}
},
"ch4_battery": {
"name": "Channel 4 battery level",
"state": {
"normal": "OK",
"low": "Low",
"unknown": "Unknown / drained out"
}
},
"ch5_battery": {
"name": "Channel 5 battery level",
"state": {
"normal": "OK",
"low": "Low",
"unknown": "Unknown / drained out"
}
},
"ch6_battery": {
"name": "Channel 6 battery level",
"state": {
"normal": "OK",
"low": "Low",
"unknown": "Unknown / drained out"
}
},
"ch7_battery": {
"name": "Channel 7 battery level",
"state": {
"normal": "OK",
"low": "Low",
"unknown": "Unknown / drained out"
}
},
"ch8_battery": {
"name": "Channel 8 battery level",
"state": {
"normal": "OK",
"low": "Low",
"unknown": "Unknown / drained out"
}
},
"indoor_battery": {
"name": "Console battery level",
"state": {
"normal": "OK",
"low": "Low",
"unknown": "Unknown / drained out"
}
}
}
},

View File

@ -1,26 +1,29 @@
"""Utils for SWS12500."""
"""Utils for SWS12500.
This module contains small helpers used across the integration.
Notable responsibilities:
- Payload remapping: convert raw station/webhook field names into stable internal keys.
- Auto-discovery helpers: detect new payload fields that are not enabled yet and persist them
to config entry options so sensors can be created dynamically.
- Formatting/conversion helpers (wind direction text, battery mapping, temperature conversions).
Keeping these concerns in one place avoids duplicating logic in the webhook handler and entity code.
"""
import logging
import math
from pathlib import Path
import sqlite3
from typing import Any, cast
import numpy as np
from py_typecheck.core import checked_or
from homeassistant.components import persistent_notification
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
UnitOfPrecipitationDepth,
UnitOfTemperature,
UnitOfVolumetricFlux,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.translation import async_get_translations
from .const import (
AZIMUT,
DATABASE_PATH,
DEV_DBG,
OUTSIDE_HUMIDITY,
OUTSIDE_TEMP,
@ -28,6 +31,7 @@ from .const import (
REMAP_WSLINK_ITEMS,
SENSORS_TO_LOAD,
WIND_SPEED,
UnitOfBat,
UnitOfDir,
)
@ -41,12 +45,12 @@ async def translations(
*,
key: str = "message",
category: str = "notify",
) -> str | None:
"""Get translated keys for domain."""
localize_key = f"component.{translation_domain}.{category}.{translation_key}.{key}"
language: str = hass.config.language
_translations = await async_get_translations(
hass, language, category, [translation_domain]
@ -65,7 +69,7 @@ async def translated_notification(
*,
key: str = "message",
category: str = "notify",
):
"""Translate notification."""
localize_key = f"component.{translation_domain}.{category}.{translation_key}.{key}"
@ -74,7 +78,7 @@ async def translated_notification(
f"component.{translation_domain}.{category}.{translation_key}.title"
)
language: str = cast("str", hass.config.language)
_translations = await async_get_translations(
hass, language, category, [translation_domain]
@ -95,8 +99,11 @@ async def translated_notification(
async def update_options(
hass: HomeAssistant,
entry: ConfigEntry,
update_key: str,
update_value: str | list[str] | bool,
) -> bool:
"""Update config.options entry."""
conf = {**entry.options}
conf[update_key] = update_value
@ -104,57 +111,79 @@ async def update_options(
return hass.config_entries.async_update_entry(entry, options=conf)
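The copy-then-set pattern above can be exercised on a plain dict; `merged_options` is a hypothetical stand-in for the option handling (the real code hands the new mapping to `async_update_entry`):

```python
def merged_options(options: dict, key: str, value) -> dict:
    """Return a new options mapping with one key updated (original untouched)."""
    conf = {**options}
    conf[key] = value
    return conf

base = {"WINDY_ENABLED": True}
updated = merged_options(base, "WINDY_ENABLED", False)
print(updated)  # {'WINDY_ENABLED': False}
print(base)     # {'WINDY_ENABLED': True}  (unchanged)
```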
def anonymize(
data: dict[str, str | int | float | bool],
) -> dict[str, str | int | float | bool]:
"""Anonymize received data for safe logging.
- Keep all keys, but mask sensitive values.
- Do not raise on unexpected/missing keys.
"""
secrets = {"ID", "PASSWORD", "wsid", "wspw"}
return {k: ("***" if k in secrets else v) for k, v in data.items()}
def remap_items(entities: dict[str, str]) -> dict[str, str]:
"""Remap legacy (WU-style) payload field names into internal sensor keys.
The station sends short/legacy field names (e.g. "tempf", "humidity"). Internally we use
stable keys from `const.py` (e.g. "outside_temp", "outside_humidity"). This function produces
a normalized dict that the rest of the integration can work with.
"""
return {
REMAP_ITEMS[key]: value for key, value in entities.items() if key in REMAP_ITEMS
}
def remap_wslink_items(entities: dict[str, str]) -> dict[str, str]:
"""Remap WSLink payload field names into internal sensor keys.
WSLink uses a different naming scheme than the legacy endpoint (e.g. "t1tem", "t1ws").
Just like `remap_items`, this function normalizes the payload to the integration's stable
internal keys.
"""
return {
REMAP_WSLINK_ITEMS[key]: value
for key, value in entities.items()
if key in REMAP_WSLINK_ITEMS
}
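Both remap helpers are instances of the same filter-and-rename comprehension; a standalone sketch with trimmed, assumed excerpts of the tables from `const.py`:

```python
# Assumed excerpts; the real tables in const.py are much larger.
REMAP_ITEMS = {"tempf": "outside_temp", "humidity": "outside_humidity"}
REMAP_WSLINK_ITEMS = {"t1tem": "outside_temp", "t1hum": "outside_humidity"}

def remap(entities: dict[str, str], table: dict[str, str]) -> dict[str, str]:
    """Rename known fields to stable internal keys; drop unknown ones."""
    return {table[key]: value for key, value in entities.items() if key in table}

print(remap({"tempf": "71.3", "junk": "x"}, REMAP_ITEMS))
# {'outside_temp': '71.3'}
print(remap({"t1hum": "55"}, REMAP_WSLINK_ITEMS))
# {'outside_humidity': '55'}
```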
def loaded_sensors(config_entry: ConfigEntry) -> list[str]:
"""Return sensor keys currently enabled for this config entry.
Auto-discovery persists new keys into `config_entry.options[SENSORS_TO_LOAD]`. The sensor
platform uses this list to decide which entities to create.
"""
return config_entry.options.get(SENSORS_TO_LOAD) or []
def check_disabled(
hass: HomeAssistant, items: dict[str, str], config_entry: ConfigEntry
) -> list[str] | None:
"""Detect payload fields that are not enabled yet (auto-discovery).
The integration supports "auto-discovery" of sensors: when the station starts sending a new
field, we can automatically enable and create the corresponding entity.
This helper compares the normalized payload keys (`items`) with the currently enabled sensor
keys stored in options (`SENSORS_TO_LOAD`) and returns the missing keys.
Returns:
- list[str] of newly discovered sensor keys (to be added/enabled), or
- None if no new keys were found.
Notes:
- Logging is controlled via `DEV_DBG` because payloads can arrive frequently.
"""
log = checked_or(config_entry.options.get(DEV_DBG), bool, False)
entityFound: bool = False
_loaded_sensors: list[str] = loaded_sensors(config_entry)
missing_sensors: list[str] = []
for item in items:
if log:
@ -175,12 +204,57 @@ def wind_dir_to_text(deg: float) -> UnitOfDir | None:
Returns UnitOfDir or None
"""
_deg = to_float(deg)
if _deg is not None:
_LOGGER.debug("wind_dir: %s", AZIMUT[int(abs((_deg - 11.25) % 360) / 22.5)])
return AZIMUT[int(abs((_deg - 11.25) % 360) / 22.5)]
return None
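The bucketing arithmetic deserves a worked example. Independently of the integration's `AZIMUT` table (whose exact ordering is defined in `const.py`), 16-point compass bucketing with a half-sector offset can be sketched as:

```python
COMPASS = ["N", "NNE", "NE", "ENE", "E", "ESE", "SE", "SSE",
           "S", "SSW", "SW", "WSW", "W", "WNW", "NW", "NNW"]

def deg_to_compass(deg: float) -> str:
    # Shift by half a sector (11.25 deg) so 348.75..11.25 all map to "N".
    return COMPASS[int(((deg + 11.25) % 360) / 22.5)]

print(deg_to_compass(0))      # N
print(deg_to_compass(350))    # N
print(deg_to_compass(90))     # E
print(deg_to_compass(202.5))  # SSW
```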
def battery_level(battery: int | str | None) -> UnitOfBat:
"""Return battery level.
WSLink payload values often arrive as strings (e.g. "0"/"1"), so we accept
both ints and strings and coerce to int before mapping.
Returns UnitOfBat
"""
level_map: dict[int, UnitOfBat] = {
0: UnitOfBat.LOW,
1: UnitOfBat.NORMAL,
}
if (battery is None) or (battery == ""):
return UnitOfBat.UNKNOWN
vi: int
if isinstance(battery, int):
vi = battery
else:
try:
vi = int(battery)
except ValueError:
return UnitOfBat.UNKNOWN
return level_map.get(vi, UnitOfBat.UNKNOWN)
def battery_level_to_icon(battery: UnitOfBat) -> str:
"""Return battery level in icon representation.
Returns str
"""
icons = {
UnitOfBat.LOW: "mdi:battery-low",
UnitOfBat.NORMAL: "mdi:battery",
}
return icons.get(battery, "mdi:battery-unknown")
def fahrenheit_to_celsius(fahrenheit: float) -> float:
"""Convert Fahrenheit to Celsius."""
return (fahrenheit - 32) * 5.0 / 9.0
@ -191,15 +265,62 @@ def celsius_to_fahrenheit(celsius: float) -> float:
return celsius * 9.0 / 5.0 + 32
def to_int(val: Any) -> int | None:
"""Convert int or string to int."""
if val is None:
return None
if isinstance(val, str) and val.strip() == "":
return None
try:
v = int(val)
except (TypeError, ValueError):
return None
else:
return v
def to_float(val: Any) -> float | None:
"""Convert int or string to float."""
if val is None:
return None
if isinstance(val, str) and val.strip() == "":
return None
try:
v = float(val)
except (TypeError, ValueError):
return None
else:
return v
def heat_index(
data: dict[str, int | float | str], convert: bool = False
) -> float | None:
"""Calculate heat index from temperature.
data: dict with temperature and humidity
convert: bool, convert received data from Celsius to Fahrenheit
"""
if (temp := to_float(data.get(OUTSIDE_TEMP))) is None:
_LOGGER.error(
"Missing/invalid OUTSIDE TEMP (%s), cannot calculate heat index.",
temp,
)
return None
if (rh := to_float(data.get(OUTSIDE_HUMIDITY))) is None:
_LOGGER.error(
"Missing/invalid OUTSIDE HUMIDITY (%s), cannot calculate heat index.",
rh,
)
return None
adjustment = None
if convert:
@ -218,10 +339,10 @@ def heat_index(data: Any, convert: bool = False) -> UnitOfTemperature:
+ 0.00085282 * temp * rh * rh
- 0.00000199 * temp * temp * rh * rh
)
if rh < 13 and (80 <= temp <= 112):
adjustment = ((13 - rh) / 4) * math.sqrt((17 - abs(temp - 95)) / 17)
if rh > 80 and (80 <= temp <= 87):
adjustment = ((rh - 85) / 10) * ((87 - temp) / 5)
return round((full_index + adjustment if adjustment else full_index), 2)
@ -229,15 +350,30 @@ def heat_index(data: Any, convert: bool = False) -> UnitOfTemperature:
return simple
def chill_index(
data: dict[str, str | float | int], convert: bool = False
) -> float | None:
"""Calculate wind chill index from temperature and wind speed.
data: dict with temperature and wind speed
convert: bool, convert received data from Celsius to Fahrenheit
"""
temp = to_float(data.get(OUTSIDE_TEMP))
wind = to_float(data.get(WIND_SPEED))
if temp is None:
_LOGGER.error(
"We are missing/invalid OUTSIDE TEMP (%s), cannot calculate wind chill index.",
temp,
)
return None
if wind is None:
_LOGGER.error(
"We are missing/invalid WIND SPEED (%s), cannot calculate wind chill index.",
wind,
)
return None
if convert:
temp = celsius_to_fahrenheit(temp)
@ -254,107 +390,3 @@ def chill_index(data: Any, convert: bool = False) -> UnitOfTemperature:
if temp < 50 and wind > 3
else temp
)
def long_term_units_in_statistics_meta():
"""Get units in long term statitstics."""
if not Path(DATABASE_PATH).exists():
_LOGGER.error("Database file not found: %s", DATABASE_PATH)
return False
conn = sqlite3.connect(DATABASE_PATH)
db = conn.cursor()
try:
db.execute("""
SELECT statistic_id, unit_of_measurement from statistics_meta
WHERE statistic_id LIKE 'sensor.weather_station_sws%'
""")
rows = db.fetchall()
sensor_units = {
statistic_id: f"{statistic_id} ({unit})" for statistic_id, unit in rows
}
except sqlite3.Error as e:
_LOGGER.error("Error during data migration: %s", e)
finally:
conn.close()
return sensor_units
async def migrate_data(hass: HomeAssistant, sensor_id: str | None = None) -> bool:
"""Migrate data from mm/d to mm."""
_LOGGER.debug("Sensor %s is required for data migration", sensor_id)
updated_rows = 0
if not Path(DATABASE_PATH).exists():
_LOGGER.error("Database file not found: %s", DATABASE_PATH)
return False
conn = sqlite3.connect(DATABASE_PATH)
db = conn.cursor()
try:
_LOGGER.info(sensor_id)
db.execute(
"""
UPDATE statistics_meta
SET unit_of_measurement = 'mm'
WHERE statistic_id = ?
AND unit_of_measurement = 'mm/d';
""",
(sensor_id,),
)
updated_rows = db.rowcount
conn.commit()
_LOGGER.info(
"Data migration completed successfully. Updated rows: %s for %s",
updated_rows,
sensor_id,
)
except sqlite3.Error as e:
_LOGGER.error("Error during data migration: %s", e)
finally:
conn.close()
return updated_rows
def migrate_data_old(sensor_id: str | None = None):
"""Migrate data from mm/d to mm."""
updated_rows = 0
if not Path(DATABASE_PATH).exists():
_LOGGER.error("Database file not found: %s", DATABASE_PATH)
return False
conn = sqlite3.connect(DATABASE_PATH)
db = conn.cursor()
try:
_LOGGER.info(sensor_id)
db.execute(
"""
UPDATE statistics_meta
SET unit_of_measurement = 'mm'
WHERE statistic_id = ?
AND unit_of_measurement = 'mm/d';
""",
(sensor_id,),
)
updated_rows = db.rowcount
conn.commit()
_LOGGER.info(
"Data migration completed successfully. Updated rows: %s for %s",
updated_rows,
sensor_id,
)
except sqlite3.Error as e:
_LOGGER.error("Error during data migration: %s", e)
finally:
conn.close()
return updated_rows

View File

@ -3,17 +3,24 @@
from datetime import datetime, timedelta
import logging
from aiohttp.client import ClientResponse
from aiohttp.client_exceptions import ClientError
from py_typecheck import checked
from homeassistant.components import persistent_notification
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import (
PURGE_DATA,
WINDY_API_KEY,
WINDY_ENABLED,
WINDY_INVALID_KEY,
WINDY_LOGGER_ENABLED,
WINDY_MAX_RETRIES,
WINDY_NOT_INSERTED,
WINDY_STATION_ID,
WINDY_STATION_PW,
WINDY_SUCCESS,
WINDY_UNEXPECTED,
WINDY_URL,
@ -22,19 +29,38 @@ from .utils import update_options
_LOGGER = logging.getLogger(__name__)
RESPONSE_FOR_TEST = False
class WindyNotInserted(Exception):
"""NotInserted state."""
"""NotInserted state.
Possible variants are:
- station password is invalid
- station password does not match the station
- payload failed validation
"""
class WindySuccess(Exception):
"""WindySucces state."""
class WindyApiKeyError(Exception):
"""Windy API Key error."""
class WindyPasswordMissing(Exception):
"""Windy password is missing in query or Authorization header.
This should not happen, since we check that a password is set and exit early.
"""
class WindyDuplicatePayloadDetected(Exception):
"""Duplicate payload detected."""
class WindyRateLimitExceeded(Exception):
"""Rate limit exceeded. Minimum interval is 5 minutes.
This should not happen in a running integration.
It might be seen if Home Assistant restarted and we lost track of the previous update.
"""
def timed(minutes: int):
@ -52,40 +78,86 @@ class WindyPush:
"""Init."""
self.hass = hass
self.config = config
self.enabled: bool = self.config.options.get(WINDY_ENABLED, False)
self.last_status: str = "disabled" if not self.enabled else "idle"
self.last_error: str | None = None
self.last_attempt_at: str | None = None
""" lets wait for 1 minute to get initial data from station
and then try to push first data to Windy
"""
self.last_update: datetime = datetime.now()
self.next_update: datetime = datetime.now() + timed(minutes=1)
self.log: bool = self.config.options.get(WINDY_LOGGER_ENABLED, False)
self.invalid_response_count: int = 0
# Refactored responses verification.
#
# We now comply to API at https://stations.windy.com/api-reference
def verify_windy_response(self, response: ClientResponse):
"""Verify answer form Windy."""
if self.log:
_LOGGER.info("Windy response raw response: %s", response)
if self.log and response:
_LOGGER.info("Windy raw response: %s", response.text)
if "NOTICE" in response:
raise WindyNotInserted
if "SUCCESS" in response:
if response.status == 200:
raise WindySuccess
if "Invalid API key" in response:
raise WindyApiKeyError
if response.status == 400:
raise WindyNotInserted
if "Unauthorized" in response:
raise WindyApiKeyError
if response.status == 401:
raise WindyPasswordMissing
return None
if response.status == 409:
raise WindyDuplicatePayloadDetected
async def push_data_to_windy(self, data):
if response.status == 429:
raise WindyRateLimitExceeded
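The chain of status checks can also be written as a table-driven dispatch; a trimmed, standalone sketch (exception names mirror the module's, other codes omitted):

```python
class WindySuccess(Exception):
    """HTTP 200: payload accepted."""

class WindyNotInserted(Exception):
    """HTTP 400: payload rejected (bad password or validation failure)."""

# Status -> exception table; unknown statuses fall through to None.
STATUS_MAP = {200: WindySuccess, 400: WindyNotInserted}

def verify(status: int) -> None:
    exc = STATUS_MAP.get(status)
    if exc is not None:
        raise exc

try:
    verify(200)
except WindySuccess:
    print("accepted")  # accepted
```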
def _covert_wslink_to_pws(self, indata: dict[str, str]) -> dict[str, str]:
"""Convert WSLink API data to Windy API data protocol."""
if "t1ws" in indata:
indata["wind"] = indata.pop("t1ws")
if "t1wgust" in indata:
indata["gust"] = indata.pop("t1wgust")
if "t1wdir" in indata:
indata["winddir"] = indata.pop("t1wdir")
if "t1hum" in indata:
indata["humidity"] = indata.pop("t1hum")
if "t1dew" in indata:
indata["dewpoint"] = indata.pop("t1dew")
if "t1tem" in indata:
indata["temp"] = indata.pop("t1tem")
if "rbar" in indata:
indata["mbar"] = indata.pop("rbar")
if "t1rainhr" in indata:
indata["precip"] = indata.pop("t1rainhr")
if "t1uvi" in indata:
indata["uv"] = indata.pop("t1uvi")
if "t1solrad" in indata:
indata["solarradiation"] = indata.pop("t1solrad")
return indata
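The chain of `if ... pop` renames above can equally be driven by a mapping table; a standalone sketch (field pairs copied from the method, table name hypothetical):

```python
WSLINK_TO_PWS = {
    "t1ws": "wind", "t1wgust": "gust", "t1wdir": "winddir",
    "t1tem": "temp", "rbar": "mbar", "t1uvi": "uv",
}

def convert_wslink_to_pws(indata: dict[str, str]) -> dict[str, str]:
    """Rename WSLink field names in place to the Windy/PWS scheme."""
    for src, dst in WSLINK_TO_PWS.items():
        if src in indata:
            indata[dst] = indata.pop(src)
    return indata

print(convert_wslink_to_pws({"t1tem": "21.5", "rbar": "1013"}))
# {'temp': '21.5', 'mbar': '1013'}
```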
async def _disable_windy(self, reason: str) -> None:
"""Disable Windy resending."""
self.enabled = False
self.last_status = "disabled"
self.last_error = reason
if not await update_options(self.hass, self.config, WINDY_ENABLED, False):
_LOGGER.debug("Failed to set Windy options to false.")
persistent_notification.create(self.hass, reason, "Windy resending disabled.")
async def push_data_to_windy(
self, data: dict[str, str], wslink: bool = False
) -> bool:
"""Pushes weather data do Windy stations.
Interval is 5 minutes, otherwise Windy would not accepts data.
@ -94,7 +166,32 @@ class WindyPush:
from station. But we need to do some clean up.
"""
text_for_test = None
# First check if we have valid credentials, before any data manipulation.
self.enabled = self.config.options.get(WINDY_ENABLED, False)
self.last_attempt_at = datetime.now().isoformat()
self.last_error = None
if (
windy_station_id := checked(self.config.options.get(WINDY_STATION_ID), str)
) is None:
_LOGGER.error("Windy station ID is not provided! Check your configuration.")
self.last_status = "config_error"
await self._disable_windy(
"Windy station ID is not provided. Resending is disabled for now. Reconfigure your integration."
)
return False
if (
windy_station_pw := checked(self.config.options.get(WINDY_STATION_PW), str)
) is None:
_LOGGER.error(
"Windy station password is missing! Check your configuration."
)
self.last_status = "config_error"
await self._disable_windy(
"Windy password is not provided. Resending is disabled for now. Reconfigure your integration."
)
return False
if self.log:
_LOGGER.info(
@ -104,61 +201,116 @@ class WindyPush:
)
if self.next_update > datetime.now():
self.last_status = "rate_limited_local"
return False
purged_data = data.copy()
for purge in PURGE_DATA:
if purge in purged_data:
purged_data.pop(purge)
_ = purged_data.pop(purge)
if "dewptf" in purged_data:
dewpoint = round(((float(purged_data.pop("dewptf")) - 32) / 1.8), 1)
purged_data["dewpoint"] = str(dewpoint)
if wslink:
# WSLink -> Windy params
purged_data = self._covert_wslink_to_pws(purged_data)
request_url = f"{WINDY_URL}"
purged_data["id"] = windy_station_id
purged_data["time"] = "now"
headers = {"Authorization": f"Bearer {windy_station_pw}"}
if self.log:
_LOGGER.info("Dataset for windy: %s", purged_data)
session = async_get_clientsession(self.hass)
try:
async with session.get(
request_url, params=purged_data, headers=headers
) as resp:
try:
self.verify_windy_response(response=resp)
except WindyNotInserted:
# log despite of settings
self.last_status = "not_inserted"
self.last_error = WINDY_NOT_INSERTED
self.invalid_response_count += 1
text_for_test = WINDY_NOT_INSERTED
_LOGGER.error(
"%s Max retries before disable resend function: %s",
WINDY_NOT_INSERTED,
(WINDY_MAX_RETRIES - self.invalid_response_count),
)
except WindyPasswordMissing:
# log despite of settings
self.last_status = "auth_error"
self.last_error = WINDY_INVALID_KEY
_LOGGER.critical(WINDY_INVALID_KEY)
text_for_test = WINDY_INVALID_KEY
await self._disable_windy(
reason="Windy password is missing in payload or Authorization header. Resending is disabled for now. Reconfigure your Windy settings."
)
except WindyDuplicatePayloadDetected:
self.last_status = "duplicate"
self.last_error = "Duplicate payload detected by Windy server."
_LOGGER.critical(
"Duplicate payload detected by Windy server. Will try again later. Max retries before disabling resend function: %s",
(WINDY_MAX_RETRIES - self.invalid_response_count),
)
self.invalid_response_count += 1
except WindyRateLimitExceeded:
# log despite of settings
self.last_status = "rate_limited_remote"
self.last_error = "Windy rate limit exceeded."
_LOGGER.critical(
"Windy responded with WindyRateLimitExceeded, this should happend only on restarting Home Assistant when we lost track of last send time. Pause resend for next 5 minutes."
)
self.next_update = datetime.now() + timedelta(minutes=5)
except WindySuccess:
# reset invalid_response_count
self.invalid_response_count = 0
self.last_status = "ok"
self.last_error = None
if self.log:
_LOGGER.info(WINDY_SUCCESS)
text_for_test = WINDY_SUCCESS
else:
self.last_status = "unexpected_response"
self.last_error = "Unexpected response from Windy."
self.invalid_response_count += 1
_LOGGER.debug(
"Unexpected response from Windy. Max retries before disabling resend function: %s",
(WINDY_MAX_RETRIES - self.invalid_response_count),
)
finally:
if self.invalid_response_count >= 3:
_LOGGER.critical(
"Invalid response from Windy 3 times. Disabling resend option."
)
await self._disable_windy(
reason="Unable to send data to Windy (3 times). Disabling resend option for now. Please check your Windy configuration and enable this feature afterwards."
)
except ClientError as ex:
self.last_status = "client_error"
self.last_error = str(ex)
_LOGGER.critical(
"Invalid response from Windy: %s. Will try again later, max retries before disabling resend function: %s",
str(ex),
(WINDY_MAX_RETRIES - self.invalid_response_count),
)
self.invalid_response_count += 1
if self.invalid_response_count >= WINDY_MAX_RETRIES:
_LOGGER.critical(WINDY_UNEXPECTED)
text_for_test = WINDY_UNEXPECTED
await self._disable_windy(
reason="Invalid response from Windy 3 times. Disabling resending option."
)
self.last_update = datetime.now()
self.next_update = self.last_update + timed(minutes=5)
if self.log:
_LOGGER.info("Next update: %s", str(self.next_update))
if RESPONSE_FOR_TEST and text_for_test:
return text_for_test
return True
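The retry bookkeeping in `push_data_to_windy` (count consecutive bad responses, disable resending after `WINDY_MAX_RETRIES`) boils down to a small pattern; a standalone sketch with an assumed limit of 3:

```python
WINDY_MAX_RETRIES = 3  # assumed to match the integration's constant

class RetryGate:
    """Count consecutive failures; trip (disable) once the limit is reached."""

    def __init__(self, limit: int = WINDY_MAX_RETRIES) -> None:
        self.limit = limit
        self.failures = 0
        self.enabled = True

    def record(self, ok: bool) -> None:
        # A success resets the counter, mirroring the WindySuccess handler.
        self.failures = 0 if ok else self.failures + 1
        if self.failures >= self.limit:
            self.enabled = False

gate = RetryGate()
for outcome in (False, False, True, False, False, False):
    gate.record(outcome)
print(gate.enabled)  # False
```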

View File

@ -1,4 +1,6 @@
{
"name": "Sencor SWS 12500 Weather station",
"filename": "weather-station.zip",
"render_readme": true,
"zip_release": true
}

tests/conftest.py Normal file
View File

@ -0,0 +1,38 @@
"""Pytest configuration for tests under `dev/tests`.
Goals:
- Make `custom_components.*` importable.
- Keep this file lightweight and avoid global HA test-harness side effects.
Repository layout:
- Root custom components: `SWS-12500/custom_components/...` (symlinked to `dev/custom_components/...`)
- Integration sources: `SWS-12500/dev/custom_components/...`
Note:
Some tests use lightweight `hass` stubs (e.g. SimpleNamespace) that are not compatible with
Home Assistant's full test fixtures. Do NOT enable HA-only fixtures globally here.
Instead, request such fixtures (e.g. `enable_custom_integrations`) explicitly in the specific
tests that need HA's integration loader / flow managers.
"""
from __future__ import annotations
from pathlib import Path
import sys
def pytest_configure() -> None:
"""Adjust sys.path so imports and HA loader discovery work in tests."""
repo_root = Path(__file__).resolve().parents[2] # .../SWS-12500
dev_root = repo_root / "dev"
# Ensure the repo root is importable so HA can find `custom_components/<domain>/manifest.json`.
repo_root_str = str(repo_root)
if repo_root_str not in sys.path:
sys.path.insert(0, repo_root_str)
# Also ensure `dev/` is importable for direct imports from dev tooling/tests.
dev_root_str = str(dev_root)
if dev_root_str not in sys.path:
sys.path.insert(0, dev_root_str)

tests/test_config_flow.py Normal file
View File

@ -0,0 +1,383 @@
from __future__ import annotations
from unittest.mock import patch
import pytest
from pytest_homeassistant_custom_component.common import MockConfigEntry
from custom_components.sws12500.const import (
API_ID,
API_KEY,
DEV_DBG,
DOMAIN,
ECOWITT_ENABLED,
ECOWITT_WEBHOOK_ID,
INVALID_CREDENTIALS,
POCASI_CZ_API_ID,
POCASI_CZ_API_KEY,
POCASI_CZ_ENABLED,
POCASI_CZ_LOGGER_ENABLED,
POCASI_CZ_SEND_INTERVAL,
POCASI_CZ_SEND_MINIMUM,
WINDY_ENABLED,
WINDY_LOGGER_ENABLED,
WINDY_STATION_ID,
WINDY_STATION_PW,
WSLINK,
)
from homeassistant import config_entries
@pytest.mark.asyncio
async def test_config_flow_user_form_then_create_entry(
hass, enable_custom_integrations
) -> None:
"""Online HA: config flow shows form then creates entry and options."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == "form"
assert result["step_id"] == "user"
user_input = {
API_ID: "my_id",
API_KEY: "my_key",
WSLINK: False,
DEV_DBG: False,
}
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=user_input
)
assert result2["type"] == "create_entry"
assert result2["title"] == DOMAIN
assert result2["data"] == user_input
assert result2["options"] == user_input
@pytest.mark.asyncio
async def test_config_flow_user_invalid_credentials_api_id(
hass, enable_custom_integrations
) -> None:
"""API_ID in INVALID_CREDENTIALS -> error on API_ID."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == "form"
user_input = {
API_ID: INVALID_CREDENTIALS[0],
API_KEY: "ok_key",
WSLINK: False,
DEV_DBG: False,
}
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=user_input
)
assert result2["type"] == "form"
assert result2["step_id"] == "user"
assert result2["errors"][API_ID] == "valid_credentials_api"
@pytest.mark.asyncio
async def test_config_flow_user_invalid_credentials_api_key(
hass, enable_custom_integrations
) -> None:
"""API_KEY in INVALID_CREDENTIALS -> error on API_KEY."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == "form"
user_input = {
API_ID: "ok_id",
API_KEY: INVALID_CREDENTIALS[0],
WSLINK: False,
DEV_DBG: False,
}
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=user_input
)
assert result2["type"] == "form"
assert result2["step_id"] == "user"
assert result2["errors"][API_KEY] == "valid_credentials_key"
@pytest.mark.asyncio
async def test_config_flow_user_invalid_credentials_match(
hass, enable_custom_integrations
) -> None:
"""API_KEY == API_ID -> base error."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == "form"
user_input = {
API_ID: "same",
API_KEY: "same",
WSLINK: False,
DEV_DBG: False,
}
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=user_input
)
assert result2["type"] == "form"
assert result2["step_id"] == "user"
assert result2["errors"]["base"] == "valid_credentials_match"
@pytest.mark.asyncio
async def test_options_flow_init_menu(hass, enable_custom_integrations) -> None:
"""Options flow shows menu with expected steps."""
entry = MockConfigEntry(domain=DOMAIN, data={}, options={})
entry.add_to_hass(hass)
result = await hass.config_entries.options.async_init(entry.entry_id)
assert result["type"] == "menu"
assert result["step_id"] == "init"
assert set(result["menu_options"]) == {"basic", "ecowitt", "windy", "pocasi"}
@pytest.mark.asyncio
async def test_options_flow_basic_validation_and_create_entry(
hass, enable_custom_integrations
) -> None:
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={
API_ID: "old",
API_KEY: "oldkey",
WSLINK: False,
DEV_DBG: False,
},
)
entry.add_to_hass(hass)
init = await hass.config_entries.options.async_init(entry.entry_id)
assert init["type"] == "menu"
form = await hass.config_entries.options.async_configure(
init["flow_id"], user_input={"next_step_id": "basic"}
)
assert form["type"] == "form"
assert form["step_id"] == "basic"
# Cover invalid API_ID branch in options flow basic step.
bad_api_id = await hass.config_entries.options.async_configure(
init["flow_id"],
user_input={
API_ID: INVALID_CREDENTIALS[0],
API_KEY: "ok_key",
WSLINK: False,
DEV_DBG: False,
},
)
assert bad_api_id["type"] == "form"
assert bad_api_id["step_id"] == "basic"
assert bad_api_id["errors"][API_ID] == "valid_credentials_api"
# Cover invalid API_KEY branch in options flow basic step.
bad_api_key = await hass.config_entries.options.async_configure(
init["flow_id"],
user_input={
API_ID: "ok_id",
API_KEY: INVALID_CREDENTIALS[0],
WSLINK: False,
DEV_DBG: False,
},
)
assert bad_api_key["type"] == "form"
assert bad_api_key["step_id"] == "basic"
assert bad_api_key["errors"][API_KEY] == "valid_credentials_key"
bad = await hass.config_entries.options.async_configure(
init["flow_id"],
user_input={API_ID: "same", API_KEY: "same", WSLINK: False, DEV_DBG: False},
)
assert bad["type"] == "form"
assert bad["step_id"] == "basic"
assert bad["errors"]["base"] == "valid_credentials_match"
good = await hass.config_entries.options.async_configure(
init["flow_id"],
user_input={API_ID: "new", API_KEY: "newkey", WSLINK: True, DEV_DBG: True},
)
assert good["type"] == "create_entry"
assert good["title"] == DOMAIN
assert good["data"][API_ID] == "new"
assert good["data"][API_KEY] == "newkey"
assert good["data"][WSLINK] is True
assert good["data"][DEV_DBG] is True
@pytest.mark.asyncio
async def test_options_flow_windy_requires_keys_when_enabled(
hass, enable_custom_integrations
) -> None:
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={
WINDY_ENABLED: False,
WINDY_LOGGER_ENABLED: False,
WINDY_STATION_ID: "",
WINDY_STATION_PW: "",
},
)
entry.add_to_hass(hass)
init = await hass.config_entries.options.async_init(entry.entry_id)
form = await hass.config_entries.options.async_configure(
init["flow_id"], user_input={"next_step_id": "windy"}
)
assert form["type"] == "form"
assert form["step_id"] == "windy"
bad = await hass.config_entries.options.async_configure(
init["flow_id"],
user_input={
WINDY_ENABLED: True,
WINDY_LOGGER_ENABLED: False,
WINDY_STATION_ID: "",
WINDY_STATION_PW: "",
},
)
assert bad["type"] == "form"
assert bad["step_id"] == "windy"
assert bad["errors"][WINDY_STATION_ID] == "windy_key_required"
good = await hass.config_entries.options.async_configure(
init["flow_id"],
user_input={
WINDY_ENABLED: True,
WINDY_LOGGER_ENABLED: True,
WINDY_STATION_ID: "sid",
WINDY_STATION_PW: "spw",
},
)
assert good["type"] == "create_entry"
assert good["data"][WINDY_ENABLED] is True
@pytest.mark.asyncio
async def test_options_flow_pocasi_validation_minimum_interval_and_required_keys(
hass,
enable_custom_integrations,
) -> None:
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={
POCASI_CZ_API_ID: "",
POCASI_CZ_API_KEY: "",
POCASI_CZ_ENABLED: False,
POCASI_CZ_LOGGER_ENABLED: False,
POCASI_CZ_SEND_INTERVAL: 30,
},
)
entry.add_to_hass(hass)
init = await hass.config_entries.options.async_init(entry.entry_id)
form = await hass.config_entries.options.async_configure(
init["flow_id"], user_input={"next_step_id": "pocasi"}
)
assert form["type"] == "form"
assert form["step_id"] == "pocasi"
bad = await hass.config_entries.options.async_configure(
init["flow_id"],
user_input={
POCASI_CZ_API_ID: "",
POCASI_CZ_API_KEY: "",
POCASI_CZ_ENABLED: True,
POCASI_CZ_LOGGER_ENABLED: False,
POCASI_CZ_SEND_INTERVAL: POCASI_CZ_SEND_MINIMUM - 1,
},
)
assert bad["type"] == "form"
assert bad["step_id"] == "pocasi"
assert bad["errors"][POCASI_CZ_SEND_INTERVAL] == "pocasi_send_minimum"
assert bad["errors"][POCASI_CZ_API_ID] == "pocasi_id_required"
assert bad["errors"][POCASI_CZ_API_KEY] == "pocasi_key_required"
good = await hass.config_entries.options.async_configure(
init["flow_id"],
user_input={
POCASI_CZ_API_ID: "pid",
POCASI_CZ_API_KEY: "pkey",
POCASI_CZ_ENABLED: True,
POCASI_CZ_LOGGER_ENABLED: True,
POCASI_CZ_SEND_INTERVAL: POCASI_CZ_SEND_MINIMUM,
},
)
assert good["type"] == "create_entry"
assert good["data"][POCASI_CZ_ENABLED] is True
@pytest.mark.asyncio
async def test_options_flow_ecowitt_uses_get_url_placeholders_and_webhook_default(
hass,
enable_custom_integrations,
) -> None:
"""Online HA: ecowitt step uses get_url() placeholders and secrets token when webhook id missing."""
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={
ECOWITT_WEBHOOK_ID: "",
ECOWITT_ENABLED: False,
},
)
entry.add_to_hass(hass)
init = await hass.config_entries.options.async_init(entry.entry_id)
assert init["type"] == "menu"
# NOTE:
# The integration currently attempts to mutate `yarl.URL.host` when it is missing:
#
# url: URL = URL(get_url(self.hass))
# if not url.host:
# url.host = "UNKNOWN"
#
# With current yarl versions, `URL.host` is a cached, read-only property, so this
# raises `AttributeError: cached property is read-only`.
#
# We assert that behavior explicitly to keep coverage deterministic and document the
# runtime incompatibility. If the integration code is updated to handle missing hosts
# without mutation (e.g. using `url.raw_host` or building placeholders without setting
# attributes), this assertion should be updated accordingly.
with patch(
"custom_components.sws12500.config_flow.get_url",
return_value="http://",
):
with pytest.raises(AttributeError):
await hass.config_entries.options.async_configure(
init["flow_id"], user_input={"next_step_id": "ecowitt"}
)
# Second call uses a normal URL and completes the flow.
with patch(
"custom_components.sws12500.config_flow.get_url",
return_value="http://example.local:8123",
):
form = await hass.config_entries.options.async_configure(
init["flow_id"], user_input={"next_step_id": "ecowitt"}
)
assert form["type"] == "form"
assert form["step_id"] == "ecowitt"
placeholders = form.get("description_placeholders") or {}
assert placeholders["url"] == "example.local"
assert placeholders["port"] == "8123"
assert placeholders["webhook_id"] # generated
done = await hass.config_entries.options.async_configure(
init["flow_id"],
user_input={
ECOWITT_WEBHOOK_ID: placeholders["webhook_id"],
ECOWITT_ENABLED: True,
},
)
assert done["type"] == "create_entry"
assert done["data"][ECOWITT_ENABLED] is True
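The yarl incompatibility documented in the NOTE inside the ecowitt test can be sidestepped by deriving placeholders without mutating the URL object; a minimal stdlib sketch (the `build_placeholders` name and return shape are illustrative, not the integration's actual API):

```python
from urllib.parse import urlsplit


def build_placeholders(base_url: str) -> dict[str, str]:
    """Derive host/port placeholders, falling back instead of mutating."""
    parts = urlsplit(base_url)
    # No attribute assignment on the parsed object is needed.
    host = parts.hostname or "UNKNOWN"
    port = str(parts.port) if parts.port is not None else ""
    return {"url": host, "port": port}


print(build_placeholders("http://example.local:8123"))
print(build_placeholders("http://"))  # missing host -> falls back to UNKNOWN
```

Because the fallback happens in a local variable, this works regardless of whether `URL.host` is a read-only cached property in the installed yarl version.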

8
tests/test_const.py Normal file
View File

@@ -0,0 +1,8 @@
from custom_components.sws12500.const import DEFAULT_URL, DOMAIN, WINDY_URL, WSLINK_URL
def test_const_values():
assert DOMAIN == "sws12500"
assert DEFAULT_URL == "/weatherstation/updateweatherstation.php"
assert WSLINK_URL == "/data/upload.php"
assert WINDY_URL == "https://stations.windy.com/api/v2/observation/update"

13
tests/test_data.py Normal file
View File

@@ -0,0 +1,13 @@
from custom_components.sws12500.data import (
ENTRY_ADD_ENTITIES,
ENTRY_COORDINATOR,
ENTRY_DESCRIPTIONS,
ENTRY_LAST_OPTIONS,
)
def test_data_constants():
assert ENTRY_COORDINATOR == "coordinator"
assert ENTRY_ADD_ENTITIES == "async_add_entities"
assert ENTRY_DESCRIPTIONS == "sensor_descriptions"
assert ENTRY_LAST_OPTIONS == "last_options"

95
tests/test_init.py Normal file
View File

@@ -0,0 +1,95 @@
"""Integration init tests using Home Assistant pytest fixtures.
These tests rely on `pytest-homeassistant-custom-component` to provide:
- `hass` fixture (running Home Assistant instance)
- `MockConfigEntry` helper for config entries
They validate that the integration can set up a config entry and that the
coordinator is created and stored in `hass.data`.
Note:
This integration registers aiohttp routes via `hass.http.app.router`. In this
test environment, `hass.http` may not be set up, so we patch route registration
to keep these tests focused on setup logic.
"""
from __future__ import annotations
from unittest.mock import AsyncMock
import pytest
from pytest_homeassistant_custom_component.common import MockConfigEntry
from custom_components.sws12500 import WeatherDataUpdateCoordinator, async_setup_entry
from custom_components.sws12500.const import DOMAIN
@pytest.fixture
def config_entry() -> MockConfigEntry:
"""Create a minimal config entry for the integration."""
return MockConfigEntry(domain=DOMAIN, data={}, options={})
async def test_async_setup_entry_creates_runtime_state(
hass, config_entry: MockConfigEntry, monkeypatch
):
"""Setting up a config entry should succeed and populate hass.data."""
config_entry.add_to_hass(hass)
# `async_setup_entry` calls `register_path`, which needs `hass.http`.
# Patch it out so the test doesn't depend on aiohttp being initialized.
monkeypatch.setattr(
"custom_components.sws12500.register_path",
lambda _hass, _coordinator, _coordinator_h, _entry: True,
)
# Avoid depending on Home Assistant integration loader in this test.
# This keeps the test focused on our integration's setup behavior.
monkeypatch.setattr(
hass.config_entries,
"async_forward_entry_setups",
AsyncMock(return_value=True),
)
result = await async_setup_entry(hass, config_entry)
assert result is True
assert DOMAIN in hass.data
assert config_entry.entry_id in hass.data[DOMAIN]
assert isinstance(hass.data[DOMAIN][config_entry.entry_id], dict)
async def test_async_setup_entry_forwards_sensor_platform(
hass, config_entry: MockConfigEntry, monkeypatch
):
"""The integration should forward entry setups to the sensor platform."""
config_entry.add_to_hass(hass)
# `async_setup_entry` calls `register_path`, which needs `hass.http`.
# Patch it out so the test doesn't depend on aiohttp being initialized.
monkeypatch.setattr(
"custom_components.sws12500.register_path",
lambda _hass, _coordinator, _coordinator_h, _entry: True,
)
# Patch forwarding so we don't need to load real platforms for this unit/integration test.
hass.config_entries.async_forward_entry_setups = AsyncMock(return_value=True)
result = await async_setup_entry(hass, config_entry)
assert result is True
hass.config_entries.async_forward_entry_setups.assert_awaited()
forwarded_entry, forwarded_platforms = (
hass.config_entries.async_forward_entry_setups.await_args.args
)
assert forwarded_entry.entry_id == config_entry.entry_id
assert "sensor" in list(forwarded_platforms)
async def test_weather_data_update_coordinator_can_be_constructed(
hass, config_entry: MockConfigEntry
):
"""Coordinator should be constructible with a real hass fixture."""
coordinator = WeatherDataUpdateCoordinator(hass, config_entry)
assert coordinator.hass is hass
assert coordinator.config is config_entry
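The runtime storage layout these init tests assert can be sketched with plain dicts (the key constants match `custom_components.sws12500.data`; the entry id and payload values below are hypothetical stand-ins):

```python
DOMAIN = "sws12500"
ENTRY_COORDINATOR = "coordinator"
ENTRY_LAST_OPTIONS = "last_options"

hass_data: dict = {}
entry_id = "abc123"  # hypothetical config entry id

# Mirrors what async_setup_entry populates under hass.data.
hass_data.setdefault(DOMAIN, {})[entry_id] = {
    ENTRY_COORDINATOR: object(),          # WeatherDataUpdateCoordinator in reality
    ENTRY_LAST_OPTIONS: {"api_id": "x"},  # snapshot of entry.options
}

print(entry_id in hass_data[DOMAIN])
```

This is the shape `test_async_setup_entry_creates_runtime_state` checks: a per-entry dict keyed by `entry_id`, holding the coordinator and the last-options snapshot.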

View File

@@ -0,0 +1,458 @@
from __future__ import annotations
from dataclasses import dataclass
from types import SimpleNamespace
from typing import Any
from unittest.mock import AsyncMock, MagicMock
from aiohttp.web_exceptions import HTTPUnauthorized
import pytest
from pytest_homeassistant_custom_component.common import MockConfigEntry
from custom_components.sws12500 import (
HealthCoordinator,
IncorrectDataError,
WeatherDataUpdateCoordinator,
async_setup_entry,
async_unload_entry,
register_path,
update_listener,
)
from custom_components.sws12500.const import (
API_ID,
API_KEY,
DEFAULT_URL,
DOMAIN,
HEALTH_URL,
SENSORS_TO_LOAD,
WSLINK,
WSLINK_URL,
)
from custom_components.sws12500.data import ENTRY_COORDINATOR, ENTRY_LAST_OPTIONS
@dataclass(slots=True)
class _RequestStub:
"""Minimal aiohttp Request stub used by `received_data`."""
query: dict[str, Any]
async def post(self) -> dict[str, Any]:
return {}
class _RouterStub:
"""Router stub that records route registrations."""
def __init__(self) -> None:
self.add_get_calls: list[tuple[str, Any]] = []
self.add_post_calls: list[tuple[str, Any]] = []
self.raise_on_add: Exception | None = None
def add_get(self, path: str, handler: Any, **_kwargs: Any) -> Any:
if self.raise_on_add is not None:
raise self.raise_on_add
self.add_get_calls.append((path, handler))
return SimpleNamespace(method="GET")
def add_post(self, path: str, handler: Any, **_kwargs: Any) -> Any:
if self.raise_on_add is not None:
raise self.raise_on_add
self.add_post_calls.append((path, handler))
return SimpleNamespace(method="POST")
@pytest.fixture
def hass_with_http(hass):
"""Provide a real HA hass fixture augmented with a stub http router."""
router = _RouterStub()
hass.http = SimpleNamespace(app=SimpleNamespace(router=router))
return hass
@pytest.mark.asyncio
async def test_register_path_registers_routes_and_stores_dispatcher(hass_with_http):
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={
API_ID: "id",
API_KEY: "key",
WSLINK: False,
},
)
entry.add_to_hass(hass_with_http)
coordinator = WeatherDataUpdateCoordinator(hass_with_http, entry)
coordinator_health = HealthCoordinator(hass_with_http, entry)
ok = register_path(hass_with_http, coordinator, coordinator_health, entry)
assert ok is True
# Router registrations
router: _RouterStub = hass_with_http.http.app.router
assert [p for (p, _h) in router.add_get_calls] == [
DEFAULT_URL,
WSLINK_URL,
HEALTH_URL,
]
assert [p for (p, _h) in router.add_post_calls] == [WSLINK_URL]
# Dispatcher stored
assert DOMAIN in hass_with_http.data
assert "routes" in hass_with_http.data[DOMAIN]
routes = hass_with_http.data[DOMAIN]["routes"]
assert routes is not None
# show_enabled() should return a string
assert isinstance(routes.show_enabled(), str)
@pytest.mark.asyncio
async def test_register_path_raises_config_entry_not_ready_on_router_runtime_error(
hass_with_http,
):
from homeassistant.exceptions import ConfigEntryNotReady
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={
API_ID: "id",
API_KEY: "key",
WSLINK: False,
},
)
entry.add_to_hass(hass_with_http)
coordinator = WeatherDataUpdateCoordinator(hass_with_http, entry)
coordinator_health = HealthCoordinator(hass_with_http, entry)
# Make router raise RuntimeError on add
router: _RouterStub = hass_with_http.http.app.router
router.raise_on_add = RuntimeError("router broken")
with pytest.raises(ConfigEntryNotReady):
register_path(hass_with_http, coordinator, coordinator_health, entry)
@pytest.mark.asyncio
async def test_register_path_checked_hass_data_wrong_type_raises_config_entry_not_ready(
hass_with_http,
):
"""Cover register_path branch where `checked(hass.data[DOMAIN], dict)` returns None."""
from homeassistant.exceptions import ConfigEntryNotReady
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={
API_ID: "id",
API_KEY: "key",
WSLINK: False,
},
)
entry.add_to_hass(hass_with_http)
coordinator = WeatherDataUpdateCoordinator(hass_with_http, entry)
coordinator_health = HealthCoordinator(hass_with_http, entry)
# Force wrong type under DOMAIN so `checked(..., dict)` fails.
hass_with_http.data[DOMAIN] = []
with pytest.raises(ConfigEntryNotReady):
register_path(hass_with_http, coordinator, coordinator_health, entry)
@pytest.mark.asyncio
async def test_async_setup_entry_creates_entry_dict_and_coordinator_and_forwards_platforms(
hass_with_http,
monkeypatch,
):
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={API_ID: "id", API_KEY: "key", WSLINK: False},
)
entry.add_to_hass(hass_with_http)
# Avoid loading actual platforms via HA loader.
monkeypatch.setattr(
hass_with_http.config_entries,
"async_forward_entry_setups",
AsyncMock(return_value=True),
)
ok = await async_setup_entry(hass_with_http, entry)
assert ok is True
# Runtime storage exists and is a dict
assert DOMAIN in hass_with_http.data
assert entry.entry_id in hass_with_http.data[DOMAIN]
entry_data = hass_with_http.data[DOMAIN][entry.entry_id]
assert isinstance(entry_data, dict)
# Coordinator stored and last options snapshot stored
assert isinstance(entry_data.get(ENTRY_COORDINATOR), WeatherDataUpdateCoordinator)
assert isinstance(entry_data.get(ENTRY_LAST_OPTIONS), dict)
# Forwarded setups invoked
hass_with_http.config_entries.async_forward_entry_setups.assert_awaited()
@pytest.mark.asyncio
async def test_async_setup_entry_fatal_when_register_path_returns_false(
hass_with_http, monkeypatch
):
"""Cover the fatal branch when `register_path` returns False.
async_setup_entry does:
routes_enabled = register_path(...)
if not routes_enabled: raise PlatformNotReady
"""
from homeassistant.exceptions import PlatformNotReady
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={API_ID: "id", API_KEY: "key", WSLINK: False},
)
entry.add_to_hass(hass_with_http)
# Ensure there are no pre-registered routes so async_setup_entry calls register_path.
hass_with_http.data.setdefault(DOMAIN, {})
hass_with_http.data[DOMAIN].pop("routes", None)
# Force register_path to return False
monkeypatch.setattr(
"custom_components.sws12500.register_path",
lambda _hass, _coordinator, _coordinator_h, _entry: False,
)
# Forwarding shouldn't be reached; patch anyway to avoid accidental loader calls.
monkeypatch.setattr(
hass_with_http.config_entries,
"async_forward_entry_setups",
AsyncMock(return_value=True),
)
with pytest.raises(PlatformNotReady):
await async_setup_entry(hass_with_http, entry)
@pytest.mark.asyncio
async def test_async_setup_entry_reuses_existing_coordinator_and_switches_routes(
hass_with_http,
monkeypatch,
):
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={API_ID: "id", API_KEY: "key", WSLINK: False},
)
entry.add_to_hass(hass_with_http)
# Pretend setup already happened and a coordinator exists
hass_with_http.data.setdefault(DOMAIN, {})
existing_coordinator = WeatherDataUpdateCoordinator(hass_with_http, entry)
hass_with_http.data[DOMAIN][entry.entry_id] = {
ENTRY_COORDINATOR: existing_coordinator,
ENTRY_LAST_OPTIONS: dict(entry.options),
}
# Provide pre-registered routes dispatcher
routes = hass_with_http.data[DOMAIN].get("routes")
if routes is None:
# Create a dispatcher via register_path once
coordinator_health = HealthCoordinator(hass_with_http, entry)
register_path(hass_with_http, existing_coordinator, coordinator_health, entry)
routes = hass_with_http.data[DOMAIN]["routes"]
# Turn on WSLINK to trigger dispatcher switching.
# ConfigEntry.options cannot be changed directly; use async_update_entry.
hass_with_http.config_entries.async_update_entry(
entry, options={**dict(entry.options), WSLINK: True}
)
# Avoid loading actual platforms via HA loader.
monkeypatch.setattr(
hass_with_http.config_entries,
"async_forward_entry_setups",
AsyncMock(return_value=True),
)
ok = await async_setup_entry(hass_with_http, entry)
assert ok is True
# Coordinator reused (same object)
entry_data = hass_with_http.data[DOMAIN][entry.entry_id]
assert entry_data[ENTRY_COORDINATOR] is existing_coordinator
@pytest.mark.asyncio
async def test_update_listener_skips_reload_when_only_sensors_to_load_changes(
hass_with_http,
):
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={API_ID: "id", API_KEY: "key", SENSORS_TO_LOAD: ["a"]},
)
entry.add_to_hass(hass_with_http)
# Seed hass.data snapshot
hass_with_http.data.setdefault(DOMAIN, {})
hass_with_http.data[DOMAIN][entry.entry_id] = {
# Seed the full old options snapshot. If we only store SENSORS_TO_LOAD here,
# update_listener will detect differences for other keys (e.g. auth keys) and reload.
ENTRY_LAST_OPTIONS: dict(entry.options),
}
hass_with_http.config_entries.async_reload = AsyncMock()
# Only SENSORS_TO_LOAD changes.
# ConfigEntry.options cannot be changed directly; use async_update_entry.
hass_with_http.config_entries.async_update_entry(
entry, options={**dict(entry.options), SENSORS_TO_LOAD: ["a", "b"]}
)
await update_listener(hass_with_http, entry)
hass_with_http.config_entries.async_reload.assert_not_awaited()
# Snapshot should be updated
entry_data = hass_with_http.data[DOMAIN][entry.entry_id]
assert entry_data[ENTRY_LAST_OPTIONS] == dict(entry.options)
@pytest.mark.asyncio
async def test_update_listener_triggers_reload_when_other_option_changes(
hass_with_http,
monkeypatch,
):
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={API_ID: "id", API_KEY: "key", SENSORS_TO_LOAD: ["a"], WSLINK: False},
)
entry.add_to_hass(hass_with_http)
hass_with_http.data.setdefault(DOMAIN, {})
hass_with_http.data[DOMAIN][entry.entry_id] = {
ENTRY_LAST_OPTIONS: dict(entry.options),
}
hass_with_http.config_entries.async_reload = AsyncMock(return_value=True)
# Change a different option.
# ConfigEntry.options cannot be changed directly; use async_update_entry.
hass_with_http.config_entries.async_update_entry(
entry, options={**dict(entry.options), WSLINK: True}
)
info = MagicMock()
monkeypatch.setattr("custom_components.sws12500._LOGGER.info", info)
await update_listener(hass_with_http, entry)
hass_with_http.config_entries.async_reload.assert_awaited_once_with(entry.entry_id)
info.assert_called()
@pytest.mark.asyncio
async def test_update_listener_missing_snapshot_stores_current_options_then_reloads(
hass_with_http,
):
"""Cover update_listener branch where the options snapshot is missing/invalid.
This hits:
entry_data[ENTRY_LAST_OPTIONS] = dict(entry.options)
and then proceeds to reload.
"""
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={API_ID: "id", API_KEY: "key", SENSORS_TO_LOAD: ["a"], WSLINK: False},
)
entry.add_to_hass(hass_with_http)
hass_with_http.data.setdefault(DOMAIN, {})
# Store an invalid snapshot type to force the "No/invalid snapshot" branch.
hass_with_http.data[DOMAIN][entry.entry_id] = {ENTRY_LAST_OPTIONS: "invalid"}
hass_with_http.config_entries.async_reload = AsyncMock(return_value=True)
await update_listener(hass_with_http, entry)
entry_data = hass_with_http.data[DOMAIN][entry.entry_id]
assert entry_data[ENTRY_LAST_OPTIONS] == dict(entry.options)
hass_with_http.config_entries.async_reload.assert_awaited_once_with(entry.entry_id)
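The reload gating pinned down by these `update_listener` tests can be condensed into a small predicate (a hypothetical condensation of the behavior the tests assert, not the integration's actual code):

```python
SENSORS_TO_LOAD = "sensors_to_load"  # illustrative key name


def needs_reload(old_options: dict, new_options: dict) -> bool:
    """Reload only when something other than the sensor list changed."""
    changed = {
        key
        for key in set(old_options) | set(new_options)
        if old_options.get(key) != new_options.get(key)
    }
    return bool(changed - {SENSORS_TO_LOAD})
```

A pure `SENSORS_TO_LOAD` change yields `False` (snapshot updated, no reload), while any other differing key yields `True`, matching the two reload tests above.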
@pytest.mark.asyncio
async def test_async_unload_entry_pops_runtime_data_on_success(hass_with_http):
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={API_ID: "id", API_KEY: "key"},
)
entry.add_to_hass(hass_with_http)
hass_with_http.data.setdefault(DOMAIN, {})
hass_with_http.data[DOMAIN][entry.entry_id] = {ENTRY_COORDINATOR: object()}
hass_with_http.config_entries.async_unload_platforms = AsyncMock(return_value=True)
ok = await async_unload_entry(hass_with_http, entry)
assert ok is True
assert entry.entry_id not in hass_with_http.data[DOMAIN]
@pytest.mark.asyncio
async def test_async_unload_entry_keeps_runtime_data_on_failure(hass_with_http):
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={API_ID: "id", API_KEY: "key"},
)
entry.add_to_hass(hass_with_http)
hass_with_http.data.setdefault(DOMAIN, {})
hass_with_http.data[DOMAIN][entry.entry_id] = {ENTRY_COORDINATOR: object()}
hass_with_http.config_entries.async_unload_platforms = AsyncMock(return_value=False)
ok = await async_unload_entry(hass_with_http, entry)
assert ok is False
assert entry.entry_id in hass_with_http.data[DOMAIN]
@pytest.mark.asyncio
async def test_received_data_auth_unauthorized_and_incorrect_data_paths(hass):
"""A few lifecycle-adjacent assertions to cover coordinator auth behavior in __init__.py."""
entry = MockConfigEntry(
domain=DOMAIN,
data={},
options={API_ID: "id", API_KEY: "key", WSLINK: False},
)
entry.add_to_hass(hass)
coordinator = WeatherDataUpdateCoordinator(hass, entry)
# Missing security params -> unauthorized
with pytest.raises(HTTPUnauthorized):
await coordinator.received_data(_RequestStub(query={"x": "y"})) # type: ignore[arg-type]
# Wrong credentials -> unauthorized
with pytest.raises(HTTPUnauthorized):
await coordinator.received_data(
_RequestStub(query={"ID": "id", "PASSWORD": "no"})
) # type: ignore[arg-type]
# Missing API_ID in options -> IncorrectDataError
entry2 = MockConfigEntry(
domain=DOMAIN, data={}, options={API_KEY: "key", WSLINK: False}
)
entry2.add_to_hass(hass)
coordinator2 = WeatherDataUpdateCoordinator(hass, entry2)
with pytest.raises(IncorrectDataError):
await coordinator2.received_data(
_RequestStub(query={"ID": "id", "PASSWORD": "key"})
) # type: ignore[arg-type]
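The auth behavior these `received_data` tests pin down can be summarized as a short decision sketch (a hypothetical condensation of the checks the tests exercise, with stand-in exception types; not the integration's actual code):

```python
class UnauthorizedError(Exception):
    """Stands in for aiohttp's HTTPUnauthorized in this sketch."""


class IncorrectDataError(Exception):
    """Raised when the config entry itself lacks credentials."""


def check_auth(query: dict, options: dict) -> None:
    # The config entry must carry both credentials before any request can match.
    if "API_ID" not in options or "API_KEY" not in options:
        raise IncorrectDataError
    # The request must carry the security params...
    if "ID" not in query or "PASSWORD" not in query:
        raise UnauthorizedError
    # ...and they must match the configured credentials.
    if query["ID"] != options["API_ID"] or query["PASSWORD"] != options["API_KEY"]:
        raise UnauthorizedError
```

This reproduces the three outcomes above: missing or wrong request params raise the unauthorized error, while a config entry missing `API_ID` raises `IncorrectDataError` regardless of the request.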

302
tests/test_pocasi_push.py Normal file
View File

@@ -0,0 +1,302 @@
from __future__ import annotations
from dataclasses import dataclass
from datetime import datetime, timedelta
from types import SimpleNamespace
from typing import Any, Literal
from unittest.mock import AsyncMock, MagicMock
from aiohttp import ClientError
import pytest
from custom_components.sws12500.const import (
DEFAULT_URL,
POCASI_CZ_API_ID,
POCASI_CZ_API_KEY,
POCASI_CZ_ENABLED,
POCASI_CZ_LOGGER_ENABLED,
POCASI_CZ_SEND_INTERVAL,
POCASI_CZ_UNEXPECTED,
POCASI_CZ_URL,
POCASI_INVALID_KEY,
WSLINK_URL,
)
from custom_components.sws12500.pocasti_cz import (
PocasiApiKeyError,
PocasiPush,
PocasiSuccess,
)
@dataclass(slots=True)
class _FakeResponse:
text_value: str
async def text(self) -> str:
return self.text_value
async def __aenter__(self) -> "_FakeResponse":
return self
async def __aexit__(self, exc_type, exc, tb) -> None:
return None
class _FakeSession:
def __init__(
self, *, response: _FakeResponse | None = None, exc: Exception | None = None
):
self._response = response
self._exc = exc
self.calls: list[dict[str, Any]] = []
def get(self, url: str, *, params: dict[str, Any] | None = None):
self.calls.append({"url": url, "params": dict(params or {})})
if self._exc is not None:
raise self._exc
assert self._response is not None
return self._response
def _make_entry(
*,
api_id: str | None = "id",
api_key: str | None = "key",
interval: int = 30,
logger: bool = False,
) -> Any:
options: dict[str, Any] = {
POCASI_CZ_SEND_INTERVAL: interval,
POCASI_CZ_LOGGER_ENABLED: logger,
POCASI_CZ_ENABLED: True,
}
if api_id is not None:
options[POCASI_CZ_API_ID] = api_id
if api_key is not None:
options[POCASI_CZ_API_KEY] = api_key
entry = SimpleNamespace()
entry.options = options
entry.entry_id = "test_entry_id"
return entry
@pytest.fixture
def hass():
# Minimal hass-like object; we patch client session retrieval.
return SimpleNamespace()
@pytest.mark.asyncio
async def test_push_data_to_server_missing_api_id_returns_early(monkeypatch, hass):
entry = _make_entry(api_id=None, api_key="key")
pp = PocasiPush(hass, entry)
session = _FakeSession(response=_FakeResponse("OK"))
monkeypatch.setattr(
"custom_components.sws12500.pocasti_cz.async_get_clientsession",
lambda _h: session,
)
await pp.push_data_to_server({"x": 1}, "WU")
assert session.calls == []
@pytest.mark.asyncio
async def test_push_data_to_server_missing_api_key_returns_early(monkeypatch, hass):
entry = _make_entry(api_id="id", api_key=None)
pp = PocasiPush(hass, entry)
session = _FakeSession(response=_FakeResponse("OK"))
monkeypatch.setattr(
"custom_components.sws12500.pocasti_cz.async_get_clientsession",
lambda _h: session,
)
await pp.push_data_to_server({"x": 1}, "WU")
assert session.calls == []
@pytest.mark.asyncio
async def test_push_data_to_server_respects_interval_limit(monkeypatch, hass):
entry = _make_entry(interval=30, logger=True)
pp = PocasiPush(hass, entry)
# Ensure "next_update > now" so it returns early before doing HTTP.
pp.next_update = datetime.now() + timedelta(seconds=999)
session = _FakeSession(response=_FakeResponse("OK"))
monkeypatch.setattr(
"custom_components.sws12500.pocasti_cz.async_get_clientsession",
lambda _h: session,
)
await pp.push_data_to_server({"x": 1}, "WU")
assert session.calls == []
@pytest.mark.asyncio
@pytest.mark.parametrize(
"mode,expected_path", [("WU", DEFAULT_URL), ("WSLINK", WSLINK_URL)]
)
async def test_push_data_to_server_injects_auth_and_chooses_url(
monkeypatch, hass, mode: Literal["WU", "WSLINK"], expected_path: str
):
entry = _make_entry(api_id="id", api_key="key")
pp = PocasiPush(hass, entry)
# Force send now.
pp.next_update = datetime.now() - timedelta(seconds=1)
session = _FakeSession(response=_FakeResponse("OK"))
monkeypatch.setattr(
"custom_components.sws12500.pocasti_cz.async_get_clientsession",
lambda _h: session,
)
# Avoid depending on anonymize output; just make it deterministic.
monkeypatch.setattr("custom_components.sws12500.pocasti_cz.anonymize", lambda d: d)
await pp.push_data_to_server({"temp": 1}, mode)
assert len(session.calls) == 1
call = session.calls[0]
assert call["url"] == f"{POCASI_CZ_URL}{expected_path}"
params = call["params"]
if mode == "WU":
assert params["ID"] == "id"
assert params["PASSWORD"] == "key"
else:
assert params["wsid"] == "id"
assert params["wspw"] == "key"
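The mode-dependent credential injection this parametrized test asserts can be sketched as a tiny helper (`inject_auth` is an illustrative name; only the param keys `ID`/`PASSWORD` and `wsid`/`wspw` come from the assertions above):

```python
def inject_auth(params: dict, mode: str, api_id: str, api_key: str) -> dict:
    """Return a copy of params with mode-specific auth keys added."""
    out = dict(params)
    if mode == "WU":
        # Weather Underground style credentials.
        out["ID"] = api_id
        out["PASSWORD"] = api_key
    else:
        # WSLink style credentials.
        out["wsid"] = api_id
        out["wspw"] = api_key
    return out
```

Pairing this with the URL choice (`DEFAULT_URL` for WU, `WSLINK_URL` for WSLINK) gives the full request the fake session records.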
@pytest.mark.asyncio
async def test_push_data_to_server_calls_verify_response(monkeypatch, hass):
entry = _make_entry()
pp = PocasiPush(hass, entry)
pp.next_update = datetime.now() - timedelta(seconds=1)
session = _FakeSession(response=_FakeResponse("OK"))
monkeypatch.setattr(
"custom_components.sws12500.pocasti_cz.async_get_clientsession",
lambda _h: session,
)
monkeypatch.setattr("custom_components.sws12500.pocasti_cz.anonymize", lambda d: d)
verify = MagicMock(return_value=None)
monkeypatch.setattr(pp, "verify_response", verify)
await pp.push_data_to_server({"x": 1}, "WU")
verify.assert_called_once_with("OK")
@pytest.mark.asyncio
async def test_push_data_to_server_api_key_error_disables_feature(monkeypatch, hass):
entry = _make_entry()
pp = PocasiPush(hass, entry)
pp.next_update = datetime.now() - timedelta(seconds=1)
session = _FakeSession(response=_FakeResponse("OK"))
monkeypatch.setattr(
"custom_components.sws12500.pocasti_cz.async_get_clientsession",
lambda _h: session,
)
monkeypatch.setattr("custom_components.sws12500.pocasti_cz.anonymize", lambda d: d)
def _raise(_status: str):
raise PocasiApiKeyError
monkeypatch.setattr(pp, "verify_response", _raise)
update_options = AsyncMock(return_value=True)
monkeypatch.setattr(
"custom_components.sws12500.pocasti_cz.update_options", update_options
)
crit = MagicMock()
monkeypatch.setattr("custom_components.sws12500.pocasti_cz._LOGGER.critical", crit)
await pp.push_data_to_server({"x": 1}, "WU")
crit.assert_called()
# Should log invalid key message and disable feature.
assert any(
POCASI_INVALID_KEY in str(c.args[0]) for c in crit.call_args_list if c.args
)
update_options.assert_awaited_once_with(hass, entry, POCASI_CZ_ENABLED, False)
@pytest.mark.asyncio
async def test_push_data_to_server_success_logs_when_logger_enabled(monkeypatch, hass):
entry = _make_entry(logger=True)
pp = PocasiPush(hass, entry)
pp.next_update = datetime.now() - timedelta(seconds=1)
session = _FakeSession(response=_FakeResponse("OK"))
monkeypatch.setattr(
"custom_components.sws12500.pocasti_cz.async_get_clientsession",
lambda _h: session,
)
monkeypatch.setattr("custom_components.sws12500.pocasti_cz.anonymize", lambda d: d)
def _raise_success(_status: str):
raise PocasiSuccess
monkeypatch.setattr(pp, "verify_response", _raise_success)
info = MagicMock()
monkeypatch.setattr("custom_components.sws12500.pocasti_cz._LOGGER.info", info)
await pp.push_data_to_server({"x": 1}, "WU")
info.assert_called()
@pytest.mark.asyncio
async def test_push_data_to_server_client_error_increments_and_disables_after_three(
monkeypatch, hass
):
entry = _make_entry()
pp = PocasiPush(hass, entry)
update_options = AsyncMock(return_value=True)
monkeypatch.setattr(
"custom_components.sws12500.pocasti_cz.update_options", update_options
)
crit = MagicMock()
monkeypatch.setattr("custom_components.sws12500.pocasti_cz._LOGGER.critical", crit)
session = _FakeSession(exc=ClientError("boom"))
monkeypatch.setattr(
"custom_components.sws12500.pocasti_cz.async_get_clientsession",
lambda _h: session,
)
# Force request attempts and exceed invalid count threshold.
for _i in range(4):
pp.next_update = datetime.now() - timedelta(seconds=1)
await pp.push_data_to_server({"x": 1}, "WU")
assert pp.invalid_response_count == 4
# Should disable after >3
update_options.assert_awaited()
args = update_options.await_args.args
assert args[2] == POCASI_CZ_ENABLED
assert args[3] is False
# Should log unexpected at least once
assert any(
POCASI_CZ_UNEXPECTED in str(c.args[0]) for c in crit.call_args_list if c.args
)
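This test pins down a simple failure policy: each `ClientError` increments `invalid_response_count`, and once the count exceeds 3 the feature is switched off via `update_options(..., POCASI_CZ_ENABLED, False)`. A sketch of that counter logic under the same threshold (class and method names are assumptions for illustration):

```python
class FailurePolicy:
    """Disable a feature after more than `max_failures` consecutive errors."""

    def __init__(self, max_failures: int = 3) -> None:
        self.max_failures = max_failures
        self.invalid_response_count = 0
        self.enabled = True

    def record_failure(self) -> None:
        self.invalid_response_count += 1
        if self.invalid_response_count > self.max_failures:
            # Mirrors the integration disabling POCASI_CZ_ENABLED in its options.
            self.enabled = False
```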
def test_verify_response_logs_debug_when_logger_enabled(monkeypatch, hass):
entry = _make_entry(logger=True)
pp = PocasiPush(hass, entry)
dbg = MagicMock()
monkeypatch.setattr("custom_components.sws12500.pocasti_cz._LOGGER.debug", dbg)
assert pp.verify_response("anything") is None
dbg.assert_called()

498
tests/test_received_data.py Normal file

@@ -0,0 +1,498 @@
from __future__ import annotations
from dataclasses import dataclass
from types import SimpleNamespace
from typing import Any
from unittest.mock import AsyncMock, MagicMock
from aiohttp.web_exceptions import HTTPUnauthorized
import pytest
from custom_components.sws12500 import IncorrectDataError, WeatherDataUpdateCoordinator
from custom_components.sws12500.const import (
API_ID,
API_KEY,
DEFAULT_URL,
DOMAIN,
POCASI_CZ_ENABLED,
SENSORS_TO_LOAD,
WINDY_ENABLED,
WSLINK,
WSLINK_URL,
)
@dataclass(slots=True)
class _RequestStub:
"""Minimal aiohttp Request stub.
The coordinator uses `webdata.query` and `await webdata.post()`.
"""
query: dict[str, Any]
post_data: dict[str, Any] | None = None
async def post(self) -> dict[str, Any]:
return self.post_data or {}
def _make_entry(
*,
wslink: bool = False,
api_id: str | None = "id",
api_key: str | None = "key",
windy_enabled: bool = False,
pocasi_enabled: bool = False,
dev_debug: bool = False,
) -> Any:
"""Create a minimal config entry stub with `.options` and `.entry_id`."""
options: dict[str, Any] = {
WSLINK: wslink,
WINDY_ENABLED: windy_enabled,
POCASI_CZ_ENABLED: pocasi_enabled,
"dev_debug_checkbox": dev_debug,
}
if api_id is not None:
options[API_ID] = api_id
if api_key is not None:
options[API_KEY] = api_key
entry = SimpleNamespace()
entry.entry_id = "test_entry_id"
entry.options = options
return entry
@pytest.mark.asyncio
async def test_received_data_wu_missing_security_params_raises_http_unauthorized(
hass, monkeypatch
):
entry = _make_entry(wslink=False)
coordinator = WeatherDataUpdateCoordinator(hass, entry)
# No ID/PASSWORD -> unauthorized
request = _RequestStub(query={"foo": "bar"})
with pytest.raises(HTTPUnauthorized):
await coordinator.received_data(request) # type: ignore[arg-type]
@pytest.mark.asyncio
async def test_received_data_wslink_missing_security_params_raises_http_unauthorized(
hass, monkeypatch
):
entry = _make_entry(wslink=True)
coordinator = WeatherDataUpdateCoordinator(hass, entry)
# No wsid/wspw -> unauthorized
request = _RequestStub(query={"foo": "bar"})
with pytest.raises(HTTPUnauthorized):
await coordinator.received_data(request) # type: ignore[arg-type]
@pytest.mark.asyncio
async def test_received_data_missing_api_id_in_options_raises_incorrect_data_error(
hass, monkeypatch
):
entry = _make_entry(wslink=False, api_id=None, api_key="key")
coordinator = WeatherDataUpdateCoordinator(hass, entry)
request = _RequestStub(query={"ID": "id", "PASSWORD": "key"})
with pytest.raises(IncorrectDataError):
await coordinator.received_data(request) # type: ignore[arg-type]
@pytest.mark.asyncio
async def test_received_data_missing_api_key_in_options_raises_incorrect_data_error(
hass, monkeypatch
):
entry = _make_entry(wslink=False, api_id="id", api_key=None)
coordinator = WeatherDataUpdateCoordinator(hass, entry)
request = _RequestStub(query={"ID": "id", "PASSWORD": "key"})
with pytest.raises(IncorrectDataError):
await coordinator.received_data(request) # type: ignore[arg-type]
@pytest.mark.asyncio
async def test_received_data_wrong_credentials_raises_http_unauthorized(
hass, monkeypatch
):
entry = _make_entry(wslink=False, api_id="id", api_key="key")
coordinator = WeatherDataUpdateCoordinator(hass, entry)
request = _RequestStub(query={"ID": "id", "PASSWORD": "wrong"})
with pytest.raises(HTTPUnauthorized):
await coordinator.received_data(request) # type: ignore[arg-type]
@pytest.mark.asyncio
async def test_received_data_success_remaps_and_updates_coordinator_data(
hass, monkeypatch
):
entry = _make_entry(wslink=False, api_id="id", api_key="key")
coordinator = WeatherDataUpdateCoordinator(hass, entry)
# Patch remapping so this test doesn't depend on mapping tables.
remapped = {"outside_temp": "10"}
monkeypatch.setattr(
"custom_components.sws12500.remap_items",
lambda _data: remapped,
)
# Ensure no autodiscovery triggers
monkeypatch.setattr(
"custom_components.sws12500.check_disabled",
lambda _remaped_items, _config: [],
)
# Capture updates
coordinator.async_set_updated_data = MagicMock()
request = _RequestStub(query={"ID": "id", "PASSWORD": "key", "tempf": "50"})
resp = await coordinator.received_data(request) # type: ignore[arg-type]
assert resp.status == 200
coordinator.async_set_updated_data.assert_called_once_with(remapped)
@pytest.mark.asyncio
async def test_received_data_success_wslink_uses_wslink_remap(hass, monkeypatch):
entry = _make_entry(wslink=True, api_id="id", api_key="key")
coordinator = WeatherDataUpdateCoordinator(hass, entry)
remapped = {"ws_temp": "1"}
monkeypatch.setattr(
"custom_components.sws12500.remap_wslink_items",
lambda _data: remapped,
)
# If the wrong remapper were used, the test would fail: only the WSLink remapper is patched here.
monkeypatch.setattr(
"custom_components.sws12500.check_disabled",
lambda _remaped_items, _config: [],
)
coordinator.async_set_updated_data = MagicMock()
request = _RequestStub(query={"wsid": "id", "wspw": "key", "t": "1"})
resp = await coordinator.received_data(request) # type: ignore[arg-type]
assert resp.status == 200
coordinator.async_set_updated_data.assert_called_once_with(remapped)
@pytest.mark.asyncio
async def test_received_data_forwards_to_windy_when_enabled(hass, monkeypatch):
entry = _make_entry(wslink=False, api_id="id", api_key="key", windy_enabled=True)
coordinator = WeatherDataUpdateCoordinator(hass, entry)
coordinator.windy.push_data_to_windy = AsyncMock()
monkeypatch.setattr(
"custom_components.sws12500.remap_items",
lambda _data: {"k": "v"},
)
monkeypatch.setattr(
"custom_components.sws12500.check_disabled",
lambda _remaped_items, _config: [],
)
coordinator.async_set_updated_data = MagicMock()
request = _RequestStub(query={"ID": "id", "PASSWORD": "key", "x": "y"})
resp = await coordinator.received_data(request) # type: ignore[arg-type]
assert resp.status == 200
coordinator.windy.push_data_to_windy.assert_awaited_once()
args, _kwargs = coordinator.windy.push_data_to_windy.await_args
assert isinstance(args[0], dict) # raw data dict
assert args[1] is False # wslink flag
@pytest.mark.asyncio
async def test_received_data_forwards_to_pocasi_when_enabled(hass, monkeypatch):
entry = _make_entry(wslink=True, api_id="id", api_key="key", pocasi_enabled=True)
coordinator = WeatherDataUpdateCoordinator(hass, entry)
coordinator.pocasi.push_data_to_server = AsyncMock()
monkeypatch.setattr(
"custom_components.sws12500.remap_wslink_items",
lambda _data: {"k": "v"},
)
monkeypatch.setattr(
"custom_components.sws12500.check_disabled",
lambda _remaped_items, _config: [],
)
coordinator.async_set_updated_data = MagicMock()
request = _RequestStub(query={"wsid": "id", "wspw": "key", "x": "y"})
resp = await coordinator.received_data(request) # type: ignore[arg-type]
assert resp.status == 200
coordinator.pocasi.push_data_to_server.assert_awaited_once()
args, _kwargs = coordinator.pocasi.push_data_to_server.await_args
assert isinstance(args[0], dict) # raw data dict
assert args[1] == "WSLINK"
@pytest.mark.asyncio
async def test_received_data_autodiscovery_updates_options_notifies_and_adds_sensors(
hass,
monkeypatch,
):
entry = _make_entry(wslink=False, api_id="id", api_key="key")
coordinator = WeatherDataUpdateCoordinator(hass, entry)
# Arrange: remapped payload contains keys that are disabled.
remapped = {"a": "1", "b": "2"}
monkeypatch.setattr("custom_components.sws12500.remap_items", lambda _d: remapped)
# Autodiscovery finds two sensors to add
monkeypatch.setattr(
"custom_components.sws12500.check_disabled",
lambda _remaped_items, _config: ["a", "b"],
)
# No previously loaded sensors
monkeypatch.setattr("custom_components.sws12500.loaded_sensors", lambda _c: [])
# translations returns a friendly name for each sensor key
async def _translations(_hass, _domain, _key, **_kwargs):
# return something non-None so it's included in human readable string
return "Name"
monkeypatch.setattr("custom_components.sws12500.translations", _translations)
translated_notification = AsyncMock()
monkeypatch.setattr(
"custom_components.sws12500.translated_notification", translated_notification
)
update_options = AsyncMock()
monkeypatch.setattr("custom_components.sws12500.update_options", update_options)
add_new_sensors = MagicMock()
monkeypatch.setattr(
"custom_components.sws12500.sensor.add_new_sensors", add_new_sensors
)
coordinator.async_set_updated_data = MagicMock()
request = _RequestStub(query={"ID": "id", "PASSWORD": "key"})
resp = await coordinator.received_data(request) # type: ignore[arg-type]
assert resp.status == 200
# It should notify
translated_notification.assert_awaited()
# It should persist newly discovered sensors
update_options.assert_awaited()
args, _kwargs = update_options.await_args
assert args[2] == SENSORS_TO_LOAD
assert set(args[3]) >= {"a", "b"}
# It should add new sensors dynamically
add_new_sensors.assert_called_once()
_hass_arg, _entry_arg, keys = add_new_sensors.call_args.args
assert _hass_arg is hass
assert _entry_arg is entry
assert set(keys) == {"a", "b"}
coordinator.async_set_updated_data.assert_called_once_with(remapped)
@pytest.mark.asyncio
async def test_received_data_autodiscovery_human_readable_empty_branch_via_checked_none(
hass,
monkeypatch,
):
"""Force `checked([...], list[str])` to return None so `human_readable = ""` branch is executed."""
entry = _make_entry(wslink=False, api_id="id", api_key="key")
coordinator = WeatherDataUpdateCoordinator(hass, entry)
remapped = {"a": "1"}
monkeypatch.setattr("custom_components.sws12500.remap_items", lambda _d: remapped)
monkeypatch.setattr(
"custom_components.sws12500.check_disabled",
lambda _remaped_items, _config: ["a"],
)
monkeypatch.setattr("custom_components.sws12500.loaded_sensors", lambda _c: [])
# Return a translation so the list comprehension would normally include an item.
async def _translations(_hass, _domain, _key, **_kwargs):
return "Name"
monkeypatch.setattr("custom_components.sws12500.translations", _translations)
# Force checked(...) to return None when the code tries to validate translate_sensors as list[str].
def _checked_override(value, expected_type):
if expected_type == list[str]:
return None
return value
monkeypatch.setattr("custom_components.sws12500.checked", _checked_override)
translated_notification = AsyncMock()
monkeypatch.setattr(
"custom_components.sws12500.translated_notification", translated_notification
)
update_options = AsyncMock()
monkeypatch.setattr("custom_components.sws12500.update_options", update_options)
add_new_sensors = MagicMock()
monkeypatch.setattr(
"custom_components.sws12500.sensor.add_new_sensors", add_new_sensors
)
coordinator.async_set_updated_data = MagicMock()
request = _RequestStub(query={"ID": "id", "PASSWORD": "key"})
resp = await coordinator.received_data(request) # type: ignore[arg-type]
assert resp.status == 200
# Ensure it still notifies (with empty human readable list)
translated_notification.assert_awaited()
# And persists sensors
update_options.assert_awaited()
coordinator.async_set_updated_data.assert_called_once_with(remapped)
@pytest.mark.asyncio
async def test_received_data_autodiscovery_extends_with_loaded_sensors_branch(
hass, monkeypatch
):
"""Cover `_loaded_sensors := loaded_sensors(self.config)` branch (extend existing)."""
entry = _make_entry(wslink=False, api_id="id", api_key="key")
coordinator = WeatherDataUpdateCoordinator(hass, entry)
remapped = {"new": "1"}
monkeypatch.setattr("custom_components.sws12500.remap_items", lambda _d: remapped)
# Autodiscovery finds one new sensor
monkeypatch.setattr(
"custom_components.sws12500.check_disabled",
lambda _remaped_items, _config: ["new"],
)
# Pretend there are already loaded sensors in options
monkeypatch.setattr(
"custom_components.sws12500.loaded_sensors", lambda _c: ["existing"]
)
async def _translations(_hass, _domain, _key, **_kwargs):
return "Name"
monkeypatch.setattr("custom_components.sws12500.translations", _translations)
monkeypatch.setattr(
"custom_components.sws12500.translated_notification", AsyncMock()
)
update_options = AsyncMock()
monkeypatch.setattr("custom_components.sws12500.update_options", update_options)
monkeypatch.setattr(
"custom_components.sws12500.sensor.add_new_sensors", MagicMock()
)
coordinator.async_set_updated_data = MagicMock()
resp = await coordinator.received_data(
_RequestStub(query={"ID": "id", "PASSWORD": "key"})
) # type: ignore[arg-type]
assert resp.status == 200
# Ensure the persisted list includes both new and existing sensors
update_options.assert_awaited()
args, _kwargs = update_options.await_args
assert args[2] == SENSORS_TO_LOAD
assert set(args[3]) >= {"new", "existing"}
@pytest.mark.asyncio
async def test_received_data_autodiscovery_translations_all_none_still_notifies_and_updates(
hass, monkeypatch
):
"""Cover the branch where translated sensor names cannot be resolved (human_readable becomes empty)."""
entry = _make_entry(wslink=False, api_id="id", api_key="key")
coordinator = WeatherDataUpdateCoordinator(hass, entry)
remapped = {"a": "1"}
monkeypatch.setattr("custom_components.sws12500.remap_items", lambda _d: remapped)
monkeypatch.setattr(
"custom_components.sws12500.check_disabled",
lambda _remaped_items, _config: ["a"],
)
monkeypatch.setattr("custom_components.sws12500.loaded_sensors", lambda _c: [])
# Force translations to return None for every lookup -> translate_sensors becomes None and human_readable ""
async def _translations(_hass, _domain, _key, **_kwargs):
return None
monkeypatch.setattr("custom_components.sws12500.translations", _translations)
translated_notification = AsyncMock()
monkeypatch.setattr(
"custom_components.sws12500.translated_notification", translated_notification
)
update_options = AsyncMock()
monkeypatch.setattr("custom_components.sws12500.update_options", update_options)
add_new_sensors = MagicMock()
monkeypatch.setattr(
"custom_components.sws12500.sensor.add_new_sensors", add_new_sensors
)
coordinator.async_set_updated_data = MagicMock()
resp = await coordinator.received_data(
_RequestStub(query={"ID": "id", "PASSWORD": "key"})
) # type: ignore[arg-type]
assert resp.status == 200
translated_notification.assert_awaited()
update_options.assert_awaited()
add_new_sensors.assert_called_once()
coordinator.async_set_updated_data.assert_called_once_with(remapped)
@pytest.mark.asyncio
async def test_received_data_dev_logging_calls_anonymize_and_logs(hass, monkeypatch):
entry = _make_entry(wslink=False, api_id="id", api_key="key", dev_debug=True)
coordinator = WeatherDataUpdateCoordinator(hass, entry)
monkeypatch.setattr("custom_components.sws12500.remap_items", lambda _d: {"k": "v"})
monkeypatch.setattr(
"custom_components.sws12500.check_disabled",
lambda _remaped_items, _config: [],
)
anonymize = MagicMock(return_value={"safe": True})
monkeypatch.setattr("custom_components.sws12500.anonymize", anonymize)
log_info = MagicMock()
monkeypatch.setattr("custom_components.sws12500._LOGGER.info", log_info)
coordinator.async_set_updated_data = MagicMock()
request = _RequestStub(query={"ID": "id", "PASSWORD": "key", "x": "y"})
resp = await coordinator.received_data(request) # type: ignore[arg-type]
assert resp.status == 200
anonymize.assert_called_once()
log_info.assert_called_once()
@pytest.mark.asyncio
async def test_register_path_switching_logic_is_exercised_via_routes(monkeypatch):
"""Sanity: constants exist and are distinct (helps guard tests relying on them)."""
assert DEFAULT_URL != WSLINK_URL
assert DOMAIN == "sws12500"

105
tests/test_routes.py Normal file

@@ -0,0 +1,105 @@
from __future__ import annotations
from dataclasses import dataclass
from typing import Awaitable, Callable
from aiohttp.web import Response
import pytest
from custom_components.sws12500.routes import Routes, unregistered
Handler = Callable[["_RequestStub"], Awaitable[Response]]
@dataclass(slots=True)
class _RequestStub:
"""Minimal request stub for unit-testing the dispatcher.
`Routes.dispatch` relies on `request.method` and `request.path`.
`unregistered` accepts a request object but does not use it.
"""
method: str
path: str
@dataclass(slots=True)
class _RouteStub:
"""Minimal route stub providing `method` expected by Routes.add_route`."""
method: str
@pytest.fixture
def routes() -> Routes:
return Routes()
async def test_dispatch_unknown_path_calls_unregistered(routes: Routes) -> None:
request = _RequestStub(method="GET", path="/unregistered")
response = await routes.dispatch(request) # type: ignore[arg-type]
assert response.status == 400
async def test_unregistered_handler_returns_400() -> None:
request = _RequestStub(method="GET", path="/invalid")
response = await unregistered(request) # type: ignore[arg-type]
assert response.status == 400
async def test_dispatch_registered_but_disabled_uses_fallback(routes: Routes) -> None:
async def handler(_request: _RequestStub) -> Response:
return Response(text="OK", status=200)
routes.add_route("/a", _RouteStub(method="GET"), handler, enabled=False)
response = await routes.dispatch(_RequestStub(method="GET", path="/a")) # type: ignore[arg-type]
assert response.status == 400
async def test_dispatch_registered_and_enabled_uses_handler(routes: Routes) -> None:
async def handler(_request: _RequestStub) -> Response:
return Response(text="OK", status=201)
routes.add_route("/a", _RouteStub(method="GET"), handler, enabled=True)
response = await routes.dispatch(_RequestStub(method="GET", path="/a")) # type: ignore[arg-type]
assert response.status == 201
def test_switch_route_enables_exactly_one(routes: Routes) -> None:
async def handler_a(_request: _RequestStub) -> Response:
return Response(text="A", status=200)
async def handler_b(_request: _RequestStub) -> Response:
return Response(text="B", status=200)
routes.add_route("/a", _RouteStub(method="GET"), handler_a, enabled=True)
routes.add_route("/b", _RouteStub(method="GET"), handler_b, enabled=False)
routes.switch_route(handler_b, "/b")
assert routes.routes["GET:/a"].enabled is False
assert routes.routes["GET:/b"].enabled is True
def test_show_enabled_returns_message_when_none_enabled(routes: Routes) -> None:
async def handler(_request: _RequestStub) -> Response:
return Response(text="OK", status=200)
routes.add_route("/a", _RouteStub(method="GET"), handler, enabled=False)
routes.add_route("/b", _RouteStub(method="GET"), handler, enabled=False)
assert routes.show_enabled() == "No routes are enabled."
def test_show_enabled_includes_url_when_enabled(routes: Routes) -> None:
async def handler(_request: _RequestStub) -> Response:
return Response(text="OK", status=200)
routes.add_route("/a", _RouteStub(method="GET"), handler, enabled=False)
routes.add_route("/b", _RouteStub(method="GET"), handler, enabled=True)
msg = routes.show_enabled()
assert "Dispatcher enabled for (GET):/b" in msg
assert "handler" in msg


@@ -0,0 +1,282 @@
from __future__ import annotations
from dataclasses import dataclass
from typing import Any
from unittest.mock import MagicMock
import pytest
from custom_components.sws12500.const import (
CHILL_INDEX,
HEAT_INDEX,
OUTSIDE_HUMIDITY,
OUTSIDE_TEMP,
SENSORS_TO_LOAD,
WIND_AZIMUT,
WIND_DIR,
WIND_SPEED,
WSLINK,
)
from custom_components.sws12500.data import (
ENTRY_ADD_ENTITIES,
ENTRY_COORDINATOR,
ENTRY_DESCRIPTIONS,
)
from custom_components.sws12500.sensor import (
WeatherSensor,
_auto_enable_derived_sensors,
add_new_sensors,
async_setup_entry,
)
from custom_components.sws12500.sensors_weather import SENSOR_TYPES_WEATHER_API
from custom_components.sws12500.sensors_wslink import SENSOR_TYPES_WSLINK
@dataclass(slots=True)
class _ConfigEntryStub:
entry_id: str
options: dict[str, Any]
class _CoordinatorStub:
"""Minimal coordinator stub for WeatherSensor and platform setup."""
def __init__(
self, data: dict[str, Any] | None = None, *, config: Any | None = None
) -> None:
self.data = data if data is not None else {}
self.config = config
@pytest.fixture
def hass():
# Use a very small hass-like object; sensor platform uses only `hass.data`.
class _Hass:
def __init__(self) -> None:
self.data: dict[str, Any] = {}
return _Hass()
@pytest.fixture
def config_entry() -> _ConfigEntryStub:
return _ConfigEntryStub(entry_id="test_entry_id", options={})
def _capture_add_entities():
captured: list[Any] = []
def _add_entities(entities: list[Any]) -> None:
captured.extend(entities)
return captured, _add_entities
def test_auto_enable_derived_sensors_wind_azimut():
requested = {WIND_DIR}
expanded = _auto_enable_derived_sensors(requested)
assert WIND_DIR in expanded
assert WIND_AZIMUT in expanded
def test_auto_enable_derived_sensors_heat_index():
requested = {OUTSIDE_TEMP, OUTSIDE_HUMIDITY}
expanded = _auto_enable_derived_sensors(requested)
assert HEAT_INDEX in expanded
def test_auto_enable_derived_sensors_chill_index():
requested = {OUTSIDE_TEMP, WIND_SPEED}
expanded = _auto_enable_derived_sensors(requested)
assert CHILL_INDEX in expanded
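The three tests above characterize `_auto_enable_derived_sensors` as three expansion rules over the requested key set. A sketch using plain string keys in place of the integration's constants (the key spellings below are assumptions for illustration):

```python
def auto_enable_derived(requested: set[str]) -> set[str]:
    """Expand requested sensor keys with the derived sensors they make computable."""
    expanded = set(requested)
    if "wind_dir" in expanded:
        expanded.add("wind_azimut")  # direction enables the compass-azimuth sensor
    if {"outside_temp", "outside_humidity"} <= expanded:
        expanded.add("heat_index")   # heat index needs temperature + humidity
    if {"outside_temp", "wind_speed"} <= expanded:
        expanded.add("chill_index")  # wind chill needs temperature + wind speed
    return expanded
```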
@pytest.mark.asyncio
async def test_sensor_async_setup_entry_no_coordinator_is_noop(hass, config_entry):
# No entry dict created by integration yet; async_setup_entry should be defensive and no-op.
captured, add_entities = _capture_add_entities()
await async_setup_entry(hass, config_entry, add_entities)
assert captured == []
@pytest.mark.asyncio
async def test_sensor_async_setup_entry_stores_callback_and_descriptions_even_if_no_sensors_to_load(
hass, config_entry
):
# Prepare runtime entry data and coordinator like integration does.
hass.data.setdefault("sws12500", {})
hass.data["sws12500"][config_entry.entry_id] = {
ENTRY_COORDINATOR: _CoordinatorStub()
}
captured, add_entities = _capture_add_entities()
# No SENSORS_TO_LOAD set -> early return, but it should still store callback + descriptions.
await async_setup_entry(hass, config_entry, add_entities)
entry_data = hass.data["sws12500"][config_entry.entry_id]
assert entry_data[ENTRY_ADD_ENTITIES] is add_entities
assert isinstance(entry_data[ENTRY_DESCRIPTIONS], dict)
assert captured == []
@pytest.mark.asyncio
async def test_sensor_async_setup_entry_selects_weather_api_descriptions_when_wslink_disabled(
hass, config_entry
):
hass.data.setdefault("sws12500", {})
hass.data["sws12500"][config_entry.entry_id] = {
ENTRY_COORDINATOR: _CoordinatorStub()
}
captured, add_entities = _capture_add_entities()
# Explicitly disabled WSLINK
config_entry.options[WSLINK] = False
await async_setup_entry(hass, config_entry, add_entities)
descriptions = hass.data["sws12500"][config_entry.entry_id][ENTRY_DESCRIPTIONS]
assert set(descriptions.keys()) == {d.key for d in SENSOR_TYPES_WEATHER_API}
assert captured == []
@pytest.mark.asyncio
async def test_sensor_async_setup_entry_selects_wslink_descriptions_when_wslink_enabled(
hass, config_entry
):
hass.data.setdefault("sws12500", {})
hass.data["sws12500"][config_entry.entry_id] = {
ENTRY_COORDINATOR: _CoordinatorStub()
}
captured, add_entities = _capture_add_entities()
config_entry.options[WSLINK] = True
await async_setup_entry(hass, config_entry, add_entities)
descriptions = hass.data["sws12500"][config_entry.entry_id][ENTRY_DESCRIPTIONS]
assert set(descriptions.keys()) == {d.key for d in SENSOR_TYPES_WSLINK}
assert captured == []
@pytest.mark.asyncio
async def test_sensor_async_setup_entry_adds_requested_entities_and_auto_enables_derived(
hass, config_entry
):
hass.data.setdefault("sws12500", {})
coordinator = _CoordinatorStub()
hass.data["sws12500"][config_entry.entry_id] = {ENTRY_COORDINATOR: coordinator}
captured, add_entities = _capture_add_entities()
# Request WIND_DIR, OUTSIDE_TEMP, OUTSIDE_HUMIDITY, WIND_SPEED -> should auto-add derived keys too.
config_entry.options[WSLINK] = False
config_entry.options[SENSORS_TO_LOAD] = [
WIND_DIR,
OUTSIDE_TEMP,
OUTSIDE_HUMIDITY,
WIND_SPEED,
]
await async_setup_entry(hass, config_entry, add_entities)
# We should have at least those requested + derived in the added entities.
keys_added = {
e.entity_description.key for e in captured if isinstance(e, WeatherSensor)
}
assert WIND_DIR in keys_added
assert OUTSIDE_TEMP in keys_added
assert OUTSIDE_HUMIDITY in keys_added
assert WIND_SPEED in keys_added
# Derived:
assert WIND_AZIMUT in keys_added
assert HEAT_INDEX in keys_added
assert CHILL_INDEX in keys_added
def test_add_new_sensors_is_noop_when_domain_missing(hass, config_entry):
called = False
def add_entities(_entities: list[Any]) -> None:
nonlocal called
called = True
# No hass.data["sws12500"] at all.
add_new_sensors(hass, config_entry, keys=["anything"])
assert called is False
def test_add_new_sensors_is_noop_when_entry_missing(hass, config_entry):
hass.data["sws12500"] = {}
called = False
def add_entities(_entities: list[Any]) -> None:
nonlocal called
called = True
add_new_sensors(hass, config_entry, keys=["anything"])
assert called is False
def test_add_new_sensors_is_noop_when_callback_or_descriptions_missing(
hass, config_entry
):
hass.data["sws12500"] = {
config_entry.entry_id: {ENTRY_COORDINATOR: _CoordinatorStub()}
}
called = False
def add_entities(_entities: list[Any]) -> None:
nonlocal called
called = True
# Missing ENTRY_ADD_ENTITIES + ENTRY_DESCRIPTIONS -> no-op.
add_new_sensors(hass, config_entry, keys=["anything"])
assert called is False
def test_add_new_sensors_ignores_unknown_keys(hass, config_entry):
hass.data["sws12500"] = {
config_entry.entry_id: {
ENTRY_COORDINATOR: _CoordinatorStub(),
ENTRY_ADD_ENTITIES: MagicMock(),
ENTRY_DESCRIPTIONS: {}, # nothing known
}
}
add_new_sensors(hass, config_entry, keys=["unknown_key"])
hass.data["sws12500"][config_entry.entry_id][ENTRY_ADD_ENTITIES].assert_not_called()
def test_add_new_sensors_adds_known_keys(hass, config_entry):
coordinator = _CoordinatorStub()
add_entities = MagicMock()
# Use one known description from the weather API list.
known_desc = SENSOR_TYPES_WEATHER_API[0]
hass.data["sws12500"] = {
config_entry.entry_id: {
ENTRY_COORDINATOR: coordinator,
ENTRY_ADD_ENTITIES: add_entities,
ENTRY_DESCRIPTIONS: {known_desc.key: known_desc},
}
}
add_new_sensors(hass, config_entry, keys=[known_desc.key])
add_entities.assert_called_once()
(entities_arg,) = add_entities.call_args.args
assert isinstance(entities_arg, list)
assert len(entities_arg) == 1
assert isinstance(entities_arg[0], WeatherSensor)
assert entities_arg[0].entity_description.key == known_desc.key


@@ -0,0 +1,6 @@
# Test file for sensors_common.py module
def test_sensors_common_functionality():
# Add your test cases here
pass


@@ -0,0 +1,6 @@
# Test file for sensors_weather.py module
def test_sensors_weather_functionality():
# Add your test cases here
pass


@@ -0,0 +1,10 @@
from custom_components.sws12500.sensors_wslink import SENSOR_TYPES_WSLINK
import pytest
def test_sensor_types_wslink_structure():
assert isinstance(SENSOR_TYPES_WSLINK, tuple)
assert len(SENSOR_TYPES_WSLINK) > 0
for sensor in SENSOR_TYPES_WSLINK:
assert hasattr(sensor, "key")
assert hasattr(sensor, "native_unit_of_measurement")

6
tests/test_strings.py Normal file

@@ -0,0 +1,6 @@
# Test file for strings.json module
def test_strings_functionality():
# Add your test cases here
pass


@@ -0,0 +1,6 @@
# Test file for translations/cs.json module
def test_translations_cs_functionality():
# Add your test cases here
pass


@@ -0,0 +1,6 @@
# Test file for translations/en.json module
def test_translations_en_functionality():
# Add your test cases here
pass

8
tests/test_utils.py Normal file

@@ -0,0 +1,8 @@
from custom_components.sws12500.utils import celsius_to_fahrenheit, fahrenheit_to_celsius
def test_temperature_conversion():
assert celsius_to_fahrenheit(0) == 32
assert celsius_to_fahrenheit(100) == 212
assert fahrenheit_to_celsius(32) == 0
assert fahrenheit_to_celsius(212) == 100
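The four assertions fully determine the two linear maps; a self-contained sketch consistent with them:

```python
def celsius_to_fahrenheit(celsius: float) -> float:
    """Scale by 9/5, then offset by 32 (freezing point of water in degF)."""
    return celsius * 9 / 5 + 32

def fahrenheit_to_celsius(fahrenheit: float) -> float:
    """Inverse map: remove the 32 offset, then scale by 5/9."""
    return (fahrenheit - 32) * 5 / 9
```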

364
tests/test_utils_more.py Normal file

@@ -0,0 +1,364 @@
from __future__ import annotations
from dataclasses import dataclass
from types import SimpleNamespace
from typing import Any
from unittest.mock import AsyncMock, MagicMock
import pytest
from custom_components.sws12500.const import (
DEV_DBG,
OUTSIDE_HUMIDITY,
OUTSIDE_TEMP,
REMAP_ITEMS,
REMAP_WSLINK_ITEMS,
SENSORS_TO_LOAD,
WIND_SPEED,
UnitOfBat,
)
from custom_components.sws12500.utils import (
anonymize,
battery_level,
battery_level_to_icon,
celsius_to_fahrenheit,
check_disabled,
chill_index,
fahrenheit_to_celsius,
heat_index,
loaded_sensors,
remap_items,
remap_wslink_items,
translated_notification,
translations,
update_options,
wind_dir_to_text,
)
@dataclass(slots=True)
class _EntryStub:
entry_id: str = "test_entry_id"
options: dict[str, Any] = None # type: ignore[assignment]
class _ConfigEntriesStub:
def __init__(self) -> None:
self.async_update_entry = MagicMock(return_value=True)
class _HassStub:
def __init__(self, language: str = "en") -> None:
self.config = SimpleNamespace(language=language)
self.config_entries = _ConfigEntriesStub()
@pytest.fixture
def hass() -> _HassStub:
return _HassStub(language="en")
@pytest.fixture
def entry() -> _EntryStub:
return _EntryStub(options={})
def test_anonymize_masks_secrets_and_keeps_other_values():
data = {
"ID": "abc",
"PASSWORD": "secret",
"wsid": "id2",
"wspw": "pw2",
"temp": 10,
"ok": True,
}
out = anonymize(data)
assert out["ID"] == "***"
assert out["PASSWORD"] == "***"
assert out["wsid"] == "***"
assert out["wspw"] == "***"
assert out["temp"] == 10
assert out["ok"] is True
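The masking behavior asserted here can be sketched as a dict comprehension over a set of secret keys. `SECRET_KEYS` is an illustrative name inferred from the test, not necessarily the integration's actual constant:

```python
# Keys whose values must never appear in diagnostics output
# (set inferred from the assertions in the test above).
SECRET_KEYS = {"ID", "PASSWORD", "wsid", "wspw"}


def anonymize(data: dict) -> dict:
    # Replace secret values with "***"; pass everything else through unchanged.
    return {k: ("***" if k in SECRET_KEYS else v) for k, v in data.items()}
```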
def test_remap_items_filters_unknown_keys():
# Pick a known legacy key from the mapping
legacy_key = next(iter(REMAP_ITEMS.keys()))
internal_key = REMAP_ITEMS[legacy_key]
entities = {legacy_key: "1", "unknown": "2"}
out = remap_items(entities)
assert out == {internal_key: "1"}
def test_remap_wslink_items_filters_unknown_keys():
wslink_key = next(iter(REMAP_WSLINK_ITEMS.keys()))
internal_key = REMAP_WSLINK_ITEMS[wslink_key]
entities = {wslink_key: "x", "unknown": "y"}
out = remap_wslink_items(entities)
assert out == {internal_key: "x"}
def test_loaded_sensors_returns_list_or_empty(entry: _EntryStub):
entry.options[SENSORS_TO_LOAD] = ["a", "b"]
assert loaded_sensors(entry) == ["a", "b"]
entry.options[SENSORS_TO_LOAD] = []
assert loaded_sensors(entry) == []
entry.options.pop(SENSORS_TO_LOAD)
assert loaded_sensors(entry) == []
def test_check_disabled_returns_none_when_all_present(entry: _EntryStub):
entry.options[SENSORS_TO_LOAD] = ["a", "b"]
entry.options[DEV_DBG] = False
missing = check_disabled({"a": "1", "b": "2"}, entry)
assert missing is None
def test_check_disabled_returns_missing_keys(entry: _EntryStub):
entry.options[SENSORS_TO_LOAD] = ["a"]
entry.options[DEV_DBG] = False
missing = check_disabled({"a": "1", "b": "2", "c": "3"}, entry)
assert missing == ["b", "c"]
def test_check_disabled_logs_when_dev_dbg_enabled(entry: _EntryStub, monkeypatch):
# Just ensure logging branches are exercised without asserting exact messages.
entry.options[SENSORS_TO_LOAD] = []
entry.options[DEV_DBG] = True
monkeypatch.setattr(
"custom_components.sws12500.utils._LOGGER.info", lambda *a, **k: None
)
missing = check_disabled({"a": "1"}, entry)
assert missing == ["a"]
@pytest.mark.asyncio
async def test_update_options_calls_async_update_entry(
hass: _HassStub, entry: _EntryStub
):
entry.options = {"x": 1}
ok = await update_options(hass, entry, "y", True)
assert ok is True
hass.config_entries.async_update_entry.assert_called_once()
_called_entry = hass.config_entries.async_update_entry.call_args.args[0]
assert _called_entry is entry
called_options = hass.config_entries.async_update_entry.call_args.kwargs["options"]
assert called_options["x"] == 1
assert called_options["y"] is True
@pytest.mark.asyncio
async def test_translations_returns_value_when_key_present(
hass: _HassStub, monkeypatch
):
# Build the key that translations() will look for
localize_key = "component.sws12500.entity.sensor.test.name"
get_translations = AsyncMock(return_value={localize_key: "Translated"})
monkeypatch.setattr(
"custom_components.sws12500.utils.async_get_translations", get_translations
)
out = await translations(
hass,
"sws12500",
"sensor.test",
key="name",
category="entity",
)
assert out == "Translated"
@pytest.mark.asyncio
async def test_translations_returns_none_when_key_missing(hass: _HassStub, monkeypatch):
get_translations = AsyncMock(return_value={})
monkeypatch.setattr(
"custom_components.sws12500.utils.async_get_translations", get_translations
)
out = await translations(hass, "sws12500", "missing")
assert out is None
@pytest.mark.asyncio
async def test_translated_notification_creates_notification_without_placeholders(
hass: _HassStub, monkeypatch
):
base_key = "component.sws12500.notify.added.message"
title_key = "component.sws12500.notify.added.title"
get_translations = AsyncMock(return_value={base_key: "Msg", title_key: "Title"})
monkeypatch.setattr(
"custom_components.sws12500.utils.async_get_translations", get_translations
)
create = MagicMock()
monkeypatch.setattr(
"custom_components.sws12500.utils.persistent_notification.async_create", create
)
await translated_notification(hass, "sws12500", "added")
create.assert_called_once()
args = create.call_args.args
assert args[0] is hass
assert args[1] == "Msg"
assert args[2] == "Title"
@pytest.mark.asyncio
async def test_translated_notification_formats_placeholders(
hass: _HassStub, monkeypatch
):
base_key = "component.sws12500.notify.added.message"
title_key = "component.sws12500.notify.added.title"
get_translations = AsyncMock(
return_value={base_key: "Hello {name}", title_key: "Title"}
)
monkeypatch.setattr(
"custom_components.sws12500.utils.async_get_translations", get_translations
)
create = MagicMock()
monkeypatch.setattr(
"custom_components.sws12500.utils.persistent_notification.async_create", create
)
await translated_notification(
hass, "sws12500", "added", translation_placeholders={"name": "World"}
)
create.assert_called_once()
assert create.call_args.args[1] == "Hello World"
def test_battery_level_handles_none_empty_invalid_and_known_values():
assert battery_level(None) == UnitOfBat.UNKNOWN
assert battery_level("") == UnitOfBat.UNKNOWN
assert battery_level("x") == UnitOfBat.UNKNOWN
assert battery_level(0) == UnitOfBat.LOW
assert battery_level("0") == UnitOfBat.LOW
assert battery_level(1) == UnitOfBat.NORMAL
assert battery_level("1") == UnitOfBat.NORMAL
# Unknown numeric values map to UNKNOWN
assert battery_level(2) == UnitOfBat.UNKNOWN
assert battery_level("2") == UnitOfBat.UNKNOWN
def test_battery_level_to_icon_maps_all_and_unknown():
assert battery_level_to_icon(UnitOfBat.LOW) == "mdi:battery-low"
assert battery_level_to_icon(UnitOfBat.NORMAL) == "mdi:battery"
assert battery_level_to_icon(UnitOfBat.UNKNOWN) == "mdi:battery-unknown"
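The mapping these tests pin down (0 → LOW, 1 → NORMAL, everything else → UNKNOWN) can be sketched as below. The local `UnitOfBat` enum mirrors the one imported from `const` and is redefined only to keep the sketch self-contained:

```python
from enum import Enum


class UnitOfBat(Enum):
    LOW = "low"
    NORMAL = "normal"
    UNKNOWN = "unknown"


def battery_level(raw) -> UnitOfBat:
    # None, blank strings, and non-numeric junk all coerce to UNKNOWN;
    # only 0 and 1 (as int or string) map to defined levels.
    try:
        value = int(raw)
    except (TypeError, ValueError):
        return UnitOfBat.UNKNOWN
    return {0: UnitOfBat.LOW, 1: UnitOfBat.NORMAL}.get(value, UnitOfBat.UNKNOWN)
```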
def test_temperature_conversions_round_trip():
# Use a value that is exactly representable in binary-ish floats
f = 32.0
c = fahrenheit_to_celsius(f)
assert c == 0.0
assert celsius_to_fahrenheit(c) == 32.0
# General check (approx)
f2 = 77.0
c2 = fahrenheit_to_celsius(f2)
assert c2 == pytest.approx(25.0)
assert celsius_to_fahrenheit(c2) == pytest.approx(77.0)
def test_wind_dir_to_text_returns_none_for_zero_and_valid_for_positive():
assert wind_dir_to_text(0.0) is None
assert wind_dir_to_text(0) is None
# For a non-zero degree it should return some enum value
out = wind_dir_to_text(10.0)
assert out is not None
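A plausible shape for `wind_dir_to_text`, consistent with the assertions above (0 is treated as "no reading"). The 16-point compass table and string return type are assumptions; the real helper returns an enum value:

```python
COMPASS = ["N", "NNE", "NE", "ENE", "E", "ESE", "SE", "SSE",
           "S", "SSW", "SW", "WSW", "W", "WNW", "NW", "NNW"]


def wind_dir_to_text(degrees: float):
    # The integration treats 0 as "no reading" (the test expects None for 0).
    if not degrees:
        return None
    # Each sector is 22.5 degrees wide, centered on the compass point.
    return COMPASS[int(degrees / 22.5 + 0.5) % 16]
```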
def test_heat_index_returns_none_when_missing_temp_or_humidity(monkeypatch):
monkeypatch.setattr(
"custom_components.sws12500.utils._LOGGER.error", lambda *a, **k: None
)
assert heat_index({OUTSIDE_HUMIDITY: "50"}) is None
assert heat_index({OUTSIDE_TEMP: "80"}) is None
assert heat_index({OUTSIDE_TEMP: "x", OUTSIDE_HUMIDITY: "50"}) is None
assert heat_index({OUTSIDE_TEMP: "80", OUTSIDE_HUMIDITY: "x"}) is None
def test_heat_index_simple_path_and_full_index_path():
# Simple path: the averaged simple formula stays under the 80F threshold.
# With temp=70F, rh=40, ((simple + temp) / 2) is below 80, so the full regression is skipped.
simple = heat_index({OUTSIDE_TEMP: "70", OUTSIDE_HUMIDITY: "40"})
assert simple is not None
# Full index path: choose high temp/rh -> triggers full index.
full = heat_index({OUTSIDE_TEMP: "90", OUTSIDE_HUMIDITY: "85"})
assert full is not None
def test_heat_index_low_humidity_adjustment_branch():
# This targets:
# if rh < 13 and (80 <= temp <= 112): adjustment = ...
#
# Pick a temp/rh combo that:
# - triggers the full-index path: ((simple + temp) / 2) > 80
# - satisfies low humidity adjustment bounds
out = heat_index({OUTSIDE_TEMP: "95", OUTSIDE_HUMIDITY: "10"})
assert out is not None
def test_heat_index_convert_from_celsius_path():
# If convert=True, temp is interpreted as Celsius and converted to Fahrenheit internally.
# Use 30C (~86F) and high humidity to trigger full index path.
out = heat_index({OUTSIDE_TEMP: "30", OUTSIDE_HUMIDITY: "85"}, convert=True)
assert out is not None
def test_chill_index_returns_none_when_missing_temp_or_wind(monkeypatch):
monkeypatch.setattr(
"custom_components.sws12500.utils._LOGGER.error", lambda *a, **k: None
)
assert chill_index({WIND_SPEED: "10"}) is None
assert chill_index({OUTSIDE_TEMP: "10"}) is None
assert chill_index({OUTSIDE_TEMP: "x", WIND_SPEED: "10"}) is None
assert chill_index({OUTSIDE_TEMP: "10", WIND_SPEED: "x"}) is None
def test_chill_index_returns_calculated_when_cold_and_windy():
# temp in F, wind > 3 -> calculate when temp < 50
out = chill_index({OUTSIDE_TEMP: "40", WIND_SPEED: "10"})
assert out is not None
assert isinstance(out, float)
def test_chill_index_returns_temp_when_not_cold_or_not_windy():
# Not cold -> hits the `else temp` branch
out1 = chill_index({OUTSIDE_TEMP: "60", WIND_SPEED: "10"})
assert out1 == 60.0
# Not windy -> hits the `else temp` branch
out2 = chill_index({OUTSIDE_TEMP: "40", WIND_SPEED: "2"})
assert out2 == 40.0
# Boundary: exactly 50F should also hit the `else temp` branch (since condition is temp < 50)
out3 = chill_index({OUTSIDE_TEMP: "50", WIND_SPEED: "10"})
assert out3 == 50.0
# Boundary: exactly 3 mph should also hit the `else temp` branch (since condition is wind > 3)
out4 = chill_index({OUTSIDE_TEMP: "40", WIND_SPEED: "3"})
assert out4 == 40.0
def test_chill_index_convert_from_celsius_path():
out = chill_index({OUTSIDE_TEMP: "5", WIND_SPEED: "10"}, convert=True)
assert out is not None
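The boundary tests encode the NWS wind chill definition, which only applies below 50F with wind above 3 mph; outside that window the temperature is returned unchanged. A sketch on plain floats (the real helper parses the payload dict and supports `convert=`):

```python
def chill_index(temp_f: float, wind_mph: float) -> float:
    # NWS wind chill formula, defined only for temp < 50F and wind > 3 mph.
    if temp_f < 50 and wind_mph > 3:
        v = wind_mph ** 0.16
        return 35.74 + 0.6215 * temp_f - 35.75 * v + 0.4275 * temp_f * v
    # Boundary values (exactly 50F, exactly 3 mph) fall through to here.
    return temp_f
```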


@@ -0,0 +1,254 @@
from __future__ import annotations
from dataclasses import dataclass
from types import SimpleNamespace
from typing import Any, Callable
from unittest.mock import MagicMock
import pytest
from custom_components.sws12500.const import DOMAIN
from custom_components.sws12500.sensor import WeatherSensor
@dataclass(slots=True)
class _DescriptionStub:
"""Minimal stand-in for WeatherSensorEntityDescription.
WeatherSensor only relies on:
- key
- value_fn
- value_from_data_fn
"""
key: str
value_fn: Callable[[Any], Any] | None = None
value_from_data_fn: Callable[[dict[str, Any]], Any] | None = None
class _CoordinatorStub:
"""Minimal coordinator stub used by WeatherSensor."""
def __init__(
self, data: dict[str, Any] | None = None, *, config: Any | None = None
):
self.data = data if data is not None else {}
self.config = config
def test_native_value_prefers_value_from_data_fn_success():
desc = _DescriptionStub(
key="derived",
value_from_data_fn=lambda data: f"v:{data.get('x')}",
value_fn=lambda raw: f"raw:{raw}", # should not be used
)
coordinator = _CoordinatorStub(data={"x": 123, "derived": "ignored"})
entity = WeatherSensor(desc, coordinator)
assert entity.native_value == "v:123"
def test_native_value_value_from_data_fn_success_with_dev_logging_hits_computed_debug_branch(
monkeypatch,
):
"""Ensure value_from_data_fn works with dev logging enabled."""
desc = _DescriptionStub(
key="derived",
value_from_data_fn=lambda data: data["x"] + 1,
)
config = SimpleNamespace(options={"dev_debug_checkbox": True})
coordinator = _CoordinatorStub(data={"x": 41}, config=config)
entity = WeatherSensor(desc, coordinator)
assert entity.native_value == 42
def test_native_value_value_from_data_fn_exception_returns_none():
def boom(_data: dict[str, Any]) -> Any:
raise RuntimeError("nope")
desc = _DescriptionStub(key="derived", value_from_data_fn=boom)
coordinator = _CoordinatorStub(data={"derived": 1})
entity = WeatherSensor(desc, coordinator)
assert entity.native_value is None
def test_native_value_missing_raw_returns_none():
desc = _DescriptionStub(key="missing", value_fn=lambda raw: raw)
coordinator = _CoordinatorStub(data={})
entity = WeatherSensor(desc, coordinator)
assert entity.native_value is None
def test_native_value_missing_raw_with_dev_logging_hits_debug_branch(monkeypatch):
monkeypatch.setattr(
"custom_components.sws12500.sensor._LOGGER.debug", lambda *a, **k: None
)
desc = _DescriptionStub(key="missing", value_fn=lambda raw: raw)
config = SimpleNamespace(options={"dev_debug_checkbox": True})
coordinator = _CoordinatorStub(data={}, config=config)
entity = WeatherSensor(desc, coordinator)
assert entity.native_value is None
def test_native_value_raw_none_with_dev_logging_hits_debug_branch(monkeypatch):
# This targets the `raw is None` branch (not empty string) and ensures the debug line
# is actually executed (coverage sometimes won't attribute it when data is missing).
called = {"debug": 0}
def _debug(*_a, **_k):
called["debug"] += 1
monkeypatch.setattr("custom_components.sws12500.sensor._LOGGER.debug", _debug)
desc = _DescriptionStub(key="k", value_fn=lambda raw: raw)
config = SimpleNamespace(options={"dev_debug_checkbox": True})
# Ensure the key exists and explicitly maps to None so `data.get(key)` returns None
# in a deterministic way for coverage.
coordinator = _CoordinatorStub(data={"k": None}, config=config)
entity = WeatherSensor(desc, coordinator)
assert entity.native_value is None
assert called["debug"] >= 1
def test_native_value_missing_raw_logs_specific_message(monkeypatch):
"""Target the exact debug log line for missing raw values.
This is meant to hit the specific `_LOGGER.debug("native_value missing raw: ...")`
statement to help achieve full `sensor.py` coverage.
"""
debug = MagicMock()
monkeypatch.setattr("custom_components.sws12500.sensor._LOGGER.debug", debug)
desc = _DescriptionStub(key="k", value_fn=lambda raw: raw)
config = SimpleNamespace(options={"dev_debug_checkbox": True})
coordinator = _CoordinatorStub(data={"k": None}, config=config)
entity = WeatherSensor(desc, coordinator)
assert entity.native_value is None
debug.assert_any_call("native_value missing raw: key=%s raw=%s", "k", None)
def test_native_value_empty_string_raw_returns_none():
desc = _DescriptionStub(key="k", value_fn=lambda raw: raw)
coordinator = _CoordinatorStub(data={"k": ""})
entity = WeatherSensor(desc, coordinator)
assert entity.native_value is None
def test_native_value_empty_string_raw_with_dev_logging_hits_debug_branch(monkeypatch):
monkeypatch.setattr(
"custom_components.sws12500.sensor._LOGGER.debug", lambda *a, **k: None
)
desc = _DescriptionStub(key="k", value_fn=lambda raw: raw)
config = SimpleNamespace(options={"dev_debug_checkbox": True})
coordinator = _CoordinatorStub(data={"k": ""}, config=config)
entity = WeatherSensor(desc, coordinator)
assert entity.native_value is None
def test_native_value_no_value_fn_returns_none():
desc = _DescriptionStub(key="k", value_fn=None)
coordinator = _CoordinatorStub(data={"k": 10})
entity = WeatherSensor(desc, coordinator)
assert entity.native_value is None
def test_native_value_no_value_fn_with_dev_logging_hits_debug_branch(monkeypatch):
monkeypatch.setattr(
"custom_components.sws12500.sensor._LOGGER.debug", lambda *a, **k: None
)
desc = _DescriptionStub(key="k", value_fn=None)
config = SimpleNamespace(options={"dev_debug_checkbox": True})
coordinator = _CoordinatorStub(data={"k": 10}, config=config)
entity = WeatherSensor(desc, coordinator)
assert entity.native_value is None
def test_native_value_value_fn_success():
desc = _DescriptionStub(key="k", value_fn=lambda raw: int(raw) + 1)
coordinator = _CoordinatorStub(data={"k": "41"})
entity = WeatherSensor(desc, coordinator)
assert entity.native_value == 42
def test_native_value_value_fn_success_with_dev_logging_hits_debug_branch(monkeypatch):
monkeypatch.setattr(
"custom_components.sws12500.sensor._LOGGER.debug", lambda *a, **k: None
)
desc = _DescriptionStub(key="k", value_fn=lambda raw: int(raw) + 1)
config = SimpleNamespace(options={"dev_debug_checkbox": True})
coordinator = _CoordinatorStub(data={"k": "41"}, config=config)
entity = WeatherSensor(desc, coordinator)
assert entity.native_value == 42
def test_native_value_value_fn_exception_returns_none():
def boom(_raw: Any) -> Any:
raise ValueError("bad")
desc = _DescriptionStub(key="k", value_fn=boom)
coordinator = _CoordinatorStub(data={"k": "x"})
entity = WeatherSensor(desc, coordinator)
assert entity.native_value is None
def test_suggested_entity_id_uses_sensor_domain_and_key(monkeypatch):
# `homeassistant.helpers.entity.generate_entity_id` requires either `current_ids` or `hass`.
# Our entity isn't attached to hass in this unit test, so patch it to a deterministic result.
monkeypatch.setattr(
"custom_components.sws12500.sensor.generate_entity_id",
lambda _fmt, key: f"sensor.{key}",
)
desc = _DescriptionStub(key="outside_temp", value_fn=lambda raw: raw)
coordinator = _CoordinatorStub(data={"outside_temp": 1})
entity = WeatherSensor(desc, coordinator)
suggested = entity.suggested_entity_id
assert suggested == "sensor.outside_temp"
def test_device_info_contains_expected_identifiers_and_domain():
desc = _DescriptionStub(key="k", value_fn=lambda raw: raw)
coordinator = _CoordinatorStub(data={"k": 1})
entity = WeatherSensor(desc, coordinator)
info = entity.device_info
assert info is not None
# DeviceInfo is mapping-like; access defensively.
assert info.get("name") == "Weather Station SWS 12500"
assert info.get("manufacturer") == "Schizza"
assert info.get("model") == "Weather Station SWS 12500"
identifiers = info.get("identifiers")
assert isinstance(identifiers, set)
assert (DOMAIN,) in identifiers
def test_dev_log_flag_reads_from_config_entry_options():
# When coordinator has a config with options, WeatherSensor should read dev_debug_checkbox.
desc = _DescriptionStub(key="k", value_fn=lambda raw: raw)
config = SimpleNamespace(options={"dev_debug_checkbox": True})
coordinator = _CoordinatorStub(data={"k": 1}, config=config)
entity = WeatherSensor(desc, coordinator)
# We don't assert logs; we just ensure native_value still works with dev logging enabled.
assert entity.native_value == 1

tests/test_windy_func.py Normal file

@@ -0,0 +1,6 @@
# Test file for windy_func.py module
def test_windy_func_functionality():
# Add your test cases here
pass

tests/test_windy_push.py Normal file

@@ -0,0 +1,485 @@
from __future__ import annotations
from dataclasses import dataclass
from datetime import datetime, timedelta
from types import SimpleNamespace
from typing import Any
from unittest.mock import AsyncMock, MagicMock
from aiohttp.client_exceptions import ClientError
import pytest
from custom_components.sws12500.const import (
PURGE_DATA,
WINDY_ENABLED,
WINDY_INVALID_KEY,
WINDY_LOGGER_ENABLED,
WINDY_NOT_INSERTED,
WINDY_STATION_ID,
WINDY_STATION_PW,
WINDY_SUCCESS,
WINDY_UNEXPECTED,
WINDY_URL,
)
from custom_components.sws12500.windy_func import (
WindyNotInserted,
WindyPasswordMissing,
WindyPush,
WindySuccess,
)
@dataclass(slots=True)
class _FakeResponse:
status: int
text_value: str = ""
async def text(self) -> str:
return self.text_value
async def __aenter__(self) -> "_FakeResponse":
return self
async def __aexit__(self, exc_type, exc, tb) -> None:
return None
class _FakeSession:
def __init__(
self, *, response: _FakeResponse | None = None, exc: Exception | None = None
):
self._response = response
self._exc = exc
self.calls: list[dict[str, Any]] = []
def get(
self,
url: str,
*,
params: dict[str, Any] | None = None,
headers: dict[str, str] | None = None,
):
self.calls.append(
{"url": url, "params": dict(params or {}), "headers": dict(headers or {})}
)
if self._exc is not None:
raise self._exc
assert self._response is not None
return self._response
@pytest.fixture
def hass():
# A minimal stub is enough here because the client session getter is patched in each test.
return SimpleNamespace()
def _make_entry(**options: Any):
defaults = {
WINDY_LOGGER_ENABLED: False,
WINDY_ENABLED: True,
WINDY_STATION_ID: "station",
WINDY_STATION_PW: "token",
}
defaults.update(options)
return SimpleNamespace(options=defaults)
def test_verify_windy_response_notice_raises_not_inserted(hass):
wp = WindyPush(hass, _make_entry())
with pytest.raises(WindyNotInserted):
wp.verify_windy_response(_FakeResponse(status=400, text_value="Bad Request"))
def test_verify_windy_response_success_raises_success(hass):
wp = WindyPush(hass, _make_entry())
with pytest.raises(WindySuccess):
wp.verify_windy_response(_FakeResponse(status=200, text_value="OK"))
def test_verify_windy_response_password_missing_raises(hass):
wp = WindyPush(hass, _make_entry())
with pytest.raises(WindyPasswordMissing):
wp.verify_windy_response(_FakeResponse(status=401, text_value="Unauthorized"))
def test_covert_wslink_to_pws_maps_keys(hass):
wp = WindyPush(hass, _make_entry())
data = {
"t1ws": "1",
"t1wgust": "2",
"t1wdir": "3",
"t1hum": "4",
"t1dew": "5",
"t1tem": "6",
"rbar": "7",
"t1rainhr": "8",
"t1uvi": "9",
"t1solrad": "10",
"other": "keep",
}
out = wp._covert_wslink_to_pws(data)
assert out["wind"] == "1"
assert out["gust"] == "2"
assert out["winddir"] == "3"
assert out["humidity"] == "4"
assert out["dewpoint"] == "5"
assert out["temp"] == "6"
assert out["mbar"] == "7"
assert out["precip"] == "8"
assert out["uv"] == "9"
assert out["solarradiation"] == "10"
assert out["other"] == "keep"
for k in (
"t1ws",
"t1wgust",
"t1wdir",
"t1hum",
"t1dew",
"t1tem",
"rbar",
"t1rainhr",
"t1uvi",
"t1solrad",
):
assert k not in out
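The key translation verified here is a straight rename table. A sketch with the mapping inferred from the assertions above; `WSLINK_TO_PWS` and the public function name are illustrative (the real method is `WindyPush._covert_wslink_to_pws`):

```python
# WSLink field names -> PWS-style names, inferred from the test assertions.
WSLINK_TO_PWS = {
    "t1ws": "wind", "t1wgust": "gust", "t1wdir": "winddir",
    "t1hum": "humidity", "t1dew": "dewpoint", "t1tem": "temp",
    "rbar": "mbar", "t1rainhr": "precip", "t1uvi": "uv",
    "t1solrad": "solarradiation",
}


def convert_wslink_to_pws(data: dict) -> dict:
    # Rename known WSLink keys to their PWS equivalents; keep unknown keys as-is.
    return {WSLINK_TO_PWS.get(k, k): v for k, v in data.items()}
```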
@pytest.mark.asyncio
async def test_push_data_to_windy_respects_initial_next_update(monkeypatch, hass):
entry = _make_entry()
wp = WindyPush(hass, entry)
# Ensure "next_update > now" is true
wp.next_update = datetime.now() + timedelta(minutes=10)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.async_get_clientsession",
lambda _h: _FakeSession(response=_FakeResponse(status=200, text_value="OK")),
)
ok = await wp.push_data_to_windy({"a": "b"})
assert ok is False
@pytest.mark.asyncio
async def test_push_data_to_windy_purges_data_and_sets_auth(monkeypatch, hass):
entry = _make_entry(**{WINDY_LOGGER_ENABLED: True})
wp = WindyPush(hass, entry)
# Force it to send now
wp.next_update = datetime.now() - timedelta(seconds=1)
session = _FakeSession(response=_FakeResponse(status=200, text_value="OK"))
monkeypatch.setattr(
"custom_components.sws12500.windy_func.async_get_clientsession",
lambda _h: session,
)
data = {k: "x" for k in PURGE_DATA}
data.update({"keep": "1"})
ok = await wp.push_data_to_windy(data, wslink=False)
assert ok is True
assert len(session.calls) == 1
call = session.calls[0]
assert call["url"] == WINDY_URL
# Purged keys removed
for k in PURGE_DATA:
assert k not in call["params"]
# Added keys
assert call["params"]["id"] == entry.options[WINDY_STATION_ID]
assert call["params"]["time"] == "now"
assert (
call["headers"]["Authorization"] == f"Bearer {entry.options[WINDY_STATION_PW]}"
)
@pytest.mark.asyncio
async def test_push_data_to_windy_wslink_conversion_applied(monkeypatch, hass):
entry = _make_entry()
wp = WindyPush(hass, entry)
wp.next_update = datetime.now() - timedelta(seconds=1)
session = _FakeSession(response=_FakeResponse(status=200, text_value="OK"))
monkeypatch.setattr(
"custom_components.sws12500.windy_func.async_get_clientsession",
lambda _h: session,
)
ok = await wp.push_data_to_windy({"t1ws": "1", "t1tem": "2"}, wslink=True)
assert ok is True
params = session.calls[0]["params"]
assert "wind" in params and params["wind"] == "1"
assert "temp" in params and params["temp"] == "2"
assert "t1ws" not in params and "t1tem" not in params
@pytest.mark.asyncio
async def test_push_data_to_windy_missing_station_id_returns_false(monkeypatch, hass):
entry = _make_entry()
entry.options.pop(WINDY_STATION_ID)
wp = WindyPush(hass, entry)
wp.next_update = datetime.now() - timedelta(seconds=1)
session = _FakeSession(response=_FakeResponse(status=200, text_value="OK"))
monkeypatch.setattr(
"custom_components.sws12500.windy_func.async_get_clientsession",
lambda _h: session,
)
update_options = AsyncMock(return_value=True)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.update_options", update_options
)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.persistent_notification.create",
MagicMock(),
)
ok = await wp.push_data_to_windy({"a": "b"})
assert ok is False
assert session.calls == []
@pytest.mark.asyncio
async def test_push_data_to_windy_missing_station_pw_returns_false(monkeypatch, hass):
entry = _make_entry()
entry.options.pop(WINDY_STATION_PW)
wp = WindyPush(hass, entry)
wp.next_update = datetime.now() - timedelta(seconds=1)
session = _FakeSession(response=_FakeResponse(status=200, text_value="OK"))
monkeypatch.setattr(
"custom_components.sws12500.windy_func.async_get_clientsession",
lambda _h: session,
)
update_options = AsyncMock(return_value=True)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.update_options", update_options
)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.persistent_notification.create",
MagicMock(),
)
ok = await wp.push_data_to_windy({"a": "b"})
assert ok is False
assert session.calls == []
@pytest.mark.asyncio
async def test_push_data_to_windy_invalid_api_key_disables_windy(monkeypatch, hass):
entry = _make_entry()
wp = WindyPush(hass, entry)
wp.next_update = datetime.now() - timedelta(seconds=1)
# Response triggers WindyPasswordMissing (401)
session = _FakeSession(
response=_FakeResponse(status=401, text_value="Unauthorized")
)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.async_get_clientsession",
lambda _h: session,
)
update_options = AsyncMock(return_value=True)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.update_options", update_options
)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.persistent_notification.create",
MagicMock(),
)
ok = await wp.push_data_to_windy({"a": "b"})
assert ok is True
update_options.assert_awaited_once_with(hass, entry, WINDY_ENABLED, False)
@pytest.mark.asyncio
async def test_push_data_to_windy_invalid_api_key_update_options_failure_logs_debug(
monkeypatch, hass
):
entry = _make_entry()
wp = WindyPush(hass, entry)
wp.next_update = datetime.now() - timedelta(seconds=1)
session = _FakeSession(
response=_FakeResponse(status=401, text_value="Unauthorized")
)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.async_get_clientsession",
lambda _h: session,
)
update_options = AsyncMock(return_value=False)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.update_options", update_options
)
dbg = MagicMock()
monkeypatch.setattr("custom_components.sws12500.windy_func._LOGGER.debug", dbg)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.persistent_notification.create",
MagicMock(),
)
ok = await wp.push_data_to_windy({"a": "b"})
assert ok is True
update_options.assert_awaited_once_with(hass, entry, WINDY_ENABLED, False)
dbg.assert_called()
@pytest.mark.asyncio
async def test_push_data_to_windy_notice_logs_not_inserted(monkeypatch, hass):
entry = _make_entry(**{WINDY_LOGGER_ENABLED: True})
wp = WindyPush(hass, entry)
wp.next_update = datetime.now() - timedelta(seconds=1)
session = _FakeSession(response=_FakeResponse(status=400, text_value="Bad Request"))
monkeypatch.setattr(
"custom_components.sws12500.windy_func.async_get_clientsession",
lambda _h: session,
)
err = MagicMock()
monkeypatch.setattr("custom_components.sws12500.windy_func._LOGGER.error", err)
ok = await wp.push_data_to_windy({"a": "b"})
assert ok is True
# It logs WINDY_NOT_INSERTED regardless of log setting
err.assert_called()
@pytest.mark.asyncio
async def test_push_data_to_windy_success_logs_info_when_logger_enabled(
monkeypatch, hass
):
entry = _make_entry(**{WINDY_LOGGER_ENABLED: True})
wp = WindyPush(hass, entry)
wp.next_update = datetime.now() - timedelta(seconds=1)
session = _FakeSession(response=_FakeResponse(status=200, text_value="OK"))
monkeypatch.setattr(
"custom_components.sws12500.windy_func.async_get_clientsession",
lambda _h: session,
)
info = MagicMock()
monkeypatch.setattr("custom_components.sws12500.windy_func._LOGGER.info", info)
ok = await wp.push_data_to_windy({"a": "b"})
assert ok is True
# It should log WINDY_SUCCESS (or at least call info) when logging is enabled
info.assert_called()
@pytest.mark.asyncio
async def test_push_data_to_windy_verify_no_raise_logs_debug_not_inserted_when_logger_enabled(
monkeypatch, hass
):
"""Cover the `else:` branch when `verify_windy_response` does not raise.
This is a defensive branch in `push_data_to_windy`:
try: verify(...)
except ...:
else:
if self.log:
_LOGGER.debug(WINDY_NOT_INSERTED)
"""
entry = _make_entry(**{WINDY_LOGGER_ENABLED: True})
wp = WindyPush(hass, entry)
wp.next_update = datetime.now() - timedelta(seconds=1)
# A 500 response matches none of the conditions verify_windy_response raises on, so no exception is thrown.
session = _FakeSession(response=_FakeResponse(status=500, text_value="Error"))
monkeypatch.setattr(
"custom_components.sws12500.windy_func.async_get_clientsession",
lambda _h: session,
)
debug = MagicMock()
monkeypatch.setattr("custom_components.sws12500.windy_func._LOGGER.debug", debug)
ok = await wp.push_data_to_windy({"a": "b"})
assert ok is True
debug.assert_called()
@pytest.mark.asyncio
async def test_push_data_to_windy_client_error_increments_and_disables_after_three(
monkeypatch, hass
):
entry = _make_entry()
wp = WindyPush(hass, entry)
wp.next_update = datetime.now() - timedelta(seconds=1)
update_options = AsyncMock(return_value=True)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.update_options", update_options
)
crit = MagicMock()
monkeypatch.setattr("custom_components.sws12500.windy_func._LOGGER.critical", crit)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.persistent_notification.create",
MagicMock(),
)
# Cause ClientError on session.get
session = _FakeSession(exc=ClientError("boom"))
monkeypatch.setattr(
"custom_components.sws12500.windy_func.async_get_clientsession",
lambda _h: session,
)
# First 3 calls should not disable; 4th should
for i in range(4):
wp.next_update = datetime.now() - timedelta(seconds=1)
ok = await wp.push_data_to_windy({"a": "b"})
assert ok is True
assert wp.invalid_response_count == 4
# update_options awaited once when count > 3
update_options.assert_awaited()
args = update_options.await_args.args
assert args[2] == WINDY_ENABLED
assert args[3] is False
# It should log WINDY_UNEXPECTED at least once
assert any(
WINDY_UNEXPECTED in str(c.args[0]) for c in crit.call_args_list if c.args
)
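The disable-after-repeated-failures behavior tested here is a consecutive-failure counter that trips once the count exceeds a threshold. A sketch of the pattern; the class and method names are hypothetical, not the integration's API:

```python
class RetryGuard:
    """Signal that a feature should be disabled after more than `limit`
    consecutive failures (mirrors the counter behavior in the test above)."""

    def __init__(self, limit: int = 3):
        self.limit = limit
        self.failures = 0

    def record_failure(self) -> bool:
        # Returns True once the failure count exceeds the limit,
        # telling the caller to disable the feature.
        self.failures += 1
        return self.failures > self.limit
```

With `limit=3`, the first three failures return `False` and the fourth returns `True`, matching the test's expectation that `update_options(..., WINDY_ENABLED, False)` is only awaited after the count passes 3.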
@pytest.mark.asyncio
async def test_push_data_to_windy_client_error_disable_failure_logs_debug(
monkeypatch, hass
):
entry = _make_entry()
wp = WindyPush(hass, entry)
wp.invalid_response_count = 3 # next error will push it over the threshold
wp.next_update = datetime.now() - timedelta(seconds=1)
update_options = AsyncMock(return_value=False)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.update_options", update_options
)
dbg = MagicMock()
monkeypatch.setattr("custom_components.sws12500.windy_func._LOGGER.debug", dbg)
monkeypatch.setattr(
"custom_components.sws12500.windy_func.persistent_notification.create",
MagicMock(),
)
session = _FakeSession(exc=ClientError("boom"))
monkeypatch.setattr(
"custom_components.sws12500.windy_func.async_get_clientsession",
lambda _h: session,
)
ok = await wp.push_data_to_windy({"a": "b"})
assert ok is True
update_options.assert_awaited_once_with(hass, entry, WINDY_ENABLED, False)
dbg.assert_called()