2021.7: A new entity, trigger IDs and script debugging

Happy July, which means Home Assistant Core 2021.7!

An interesting release, with a bunch of little goodies that make it easier to
create automations and scripts, and to work with templates. Those are the
things that, in general, make me very happy. Mainly because, well, I use Home
Assistant to automate 😁

Also, we are saying “hi!” 👋 to a new type of entity, which is really exciting,
and I can’t wait to see how it will be put to use in the future.

Lastly, I want to give a shout-out to @klaasnicolaas! He has been an intern
with Nabu Casa for the past months. Besides doing the community highlights, he
has been working on some awesome stuff that will land in upcoming Home Assistant
releases.

His internship is now over, and he passed with a nice grade. Yet, it seems he
could not leave without a little present. He contributed the
Forecast.Solar integration, bringing in energy production
forecasting for your solar panels. Really cool!

Alright, that’s it! Enjoy the release!

../Frenck

New entity: Select

In this release, we welcome the select entity to the Home Assistant family. The
select entity is a close relative of the dropdown helper (also known as
input_select).

The difference is that while the input select is configured and managed by you,
the select entities are provided by integrations.

This means integrations can now provide entities that offer a choice: in the
Lovelace UI, via services in automations, and via Google Assistant.

Screenshot of a select entity, providing a choice from a list of options.

Some integrations started implementing the first select entities as of this
release. MQTT and KNX made it available for use, WLED uses it to provide
controls for selecting and activating a user preset, and with Rituals Perfume
Genie you can now change the room size for your diffuser.
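To give a flavor of how select entities can be used in automations, here is a
minimal sketch that picks an option via the select.select_option service (the
WLED preset entity and the option name below are hypothetical):

```yaml
automation:
  - alias: "Movie time lighting preset"
    trigger:
      - platform: state
        entity_id: media_player.living_room_tv
        to: "playing"
    action:
      # select.select_option works on any select entity,
      # regardless of which integration provides it.
      - service: select.select_option
        target:
          entity_id: select.wled_preset
        data:
          option: "Movie"
```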

Trigger conditions and trigger IDs

If you are creating some complex automations in YAML, you might be familiar with
this. Consider a big automation, with a whole bunch of triggers. But how
would you know which of those triggers actually triggered the automation?

You can now assign an id to your triggers that is passed into the automation
when it triggers, allowing you to make decisions based on it.

automation:
  - alias: "Trigger IDs!"
    trigger:
      - platform: state
        id: "normal"
        entity_id: binary_sensor.gate
        state: "on"
      - platform: state
        id: "forgotten"
        entity_id: binary_sensor.gate
        state: "on"
        for:
          minutes: 10
    ...

The above example triggers the same automation in two cases: when the gate
opens, and when the gate has been left open for 10 minutes (probably
forgotten). Each trigger has its own ID.

Now introducing the new trigger condition! It lets you add a condition on which
trigger fired the automation.

automation:
  - alias: "Trigger IDs!"
    ...
    action:
      ...
      - condition: trigger
        id: "forgotten"
      - service: notify.frenck_iphone
        data:
          message: "Someone left the gate open..."

You can use the trigger condition in all places where the other conditions
work as well, including things like
choose from a group of actions.
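For example, trigger IDs pair nicely with choose, letting a single automation
branch on which trigger fired. A sketch reusing the gate triggers from above:

```yaml
automation:
  - alias: "Gate notifications"
    trigger:
      - platform: state
        id: "normal"
        entity_id: binary_sensor.gate
        state: "on"
      - platform: state
        id: "forgotten"
        entity_id: binary_sensor.gate
        state: "on"
        for:
          minutes: 10
    action:
      - choose:
          # This branch only runs when the "forgotten" trigger fired.
          - conditions:
              - condition: trigger
                id: "forgotten"
            sequence:
              - service: notify.frenck_iphone
                data:
                  message: "Someone left the gate open..."
```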

Rather use the UI to create and manage your automations? No problem! These new
features have been added to the automation editor as well!

Screenshot of using a trigger condition in the automation editor.

Script debugging

In Home Assistant Core 2021.4,
we added the ability to debug automations. In this release, we’ve made these
same powerful tools available for scripts!

So, this helps for the next time you are wondering: Why didn’t that script work?
Or why did it behave as it did? What the script is going on here?

Screenshot of using the new script debugger on my office announce script.

The above screenshot shows a previous run of a script, using an interactive
graph of each step in the script, with the path it took highlighted.
Each node in the graph can be clicked to view the details of what happened
at that step in the script sequence.

Referencing other entities in triggers and conditions

A small, but possibly helpful, change to our scripts and automations.
You can now reference other entities for the above/below values of numeric
state triggers and conditions. Both sensors and number entities can be used.

For example, you can now trigger an automation if the outside temperature
is higher than the temperature inside.

automation:
  - alias: "Notify to close the window"
    trigger:
      - platform: numeric_state
        entity_id: sensor.outside_temperature
        above: sensor.inside_temperature
    action:
      - service: notify.frenck_iphone
        data:
          message: "Close all windows, it is warm outside!"

The numeric state condition supports the same.
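As a sketch, the condition variant of the trigger example above would look
like this:

```yaml
condition:
  - condition: numeric_state
    entity_id: sensor.outside_temperature
    # The threshold can now be another entity instead of a fixed number.
    above: sensor.inside_temperature
```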

Additionally, the time conditions now support a similar thing using other
sensors that provide a time in the before and after options. Time triggers
added support for that already in a previous release.
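A minimal sketch of such a time condition, assuming hypothetical sensors that
provide a time:

```yaml
condition:
  - condition: time
    # Both options now accept sensors that provide a time.
    after: sensor.workday_start
    before: sensor.workday_end
```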

Working with dates in templates

If you ever tried to work with dates in templates, you probably know that it
is hard. And honestly, that will never fully go away: times, dates, and time
zones are complex little beasts.

However, we realized that the hardest part of using dates & times in templates
is converting the state of a sensor or a text value to a datetime. This
release adds a small template method to help with that: as_datetime.

It can be used as a filter or as a method. Here is an example of
calculating the number of days until my driver’s license expires:

{{ (states('sensor.drivers_license') | as_datetime - now()).days }} days
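The same calculation could also feed a template sensor. A sketch, assuming the
modern template: YAML format and the sensor from the example above:

```yaml
template:
  - sensor:
      - name: "Days until license expires"
        unit_of_measurement: "days"
        state: >-
          {{ (states('sensor.drivers_license') | as_datetime - now()).days }}
```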

Series version tags for Docker containers

If you are using the Home Assistant Container installation method,
we recommend using a specific version tag; however, that means
you need to update the version tag each time we release a new patch version
of Home Assistant.

Thanks to @kmdm, as of this release, we also provide a series version tag
that always points to the latest patch version of that release, in addition
to all existing tags we already provide.

docker pull ghcr.io/home-assistant/home-assistant:2021.7

The 2021.7 tag will always contain the latest July release, even if that is
actually version 2021.7.2.

Other noteworthy changes

There is much more juice in this release; here are some of the other
noteworthy changes in this release:

  • Z-Wave JS got quite a few updates this release:
    • A new zwave_js.multicast_set_value service is available, allowing you to
      issue a set value command via multicast. Thanks, @raman325!
    • Each node now has a status sensor available and can be pinged using the
      new zwave_js.ping service. Added by @raman325.
    • The Z-Wave JS configuration panel now has a “Heal Network” button,
      thanks @cgarwood!
    • Z-Wave JS Server connection can now be re-configured from the Z-Wave JS
      configuration panel, added by @MartinHjelmare.
    • Z-Wave JS logs can now be downloaded, thanks @raman325!
  • The Google Assistant integration now has support for fan speed percentages and
    preset modes. Thanks, @jbouwh!
  • @jbouwh didn’t stop there and added fan preset mode support to Alexa too!
  • The Philips TV integration now supports Ambilights, added by @elupus.
  • Yamaha MusicCast integration now supports grouping services, thanks @micha91!
  • @raman325 added a whole bunch of sensors to the ClimaCell integration!
  • WLED now supports local push: updates are now instant in both directions.
    Also, the master light can now be kept, and support for controlling user
    presets was added.
  • Setting up Xiaomi devices has gotten way easier! There is no need to do
    difficult things to get the tokens. Instead, Home Assistant can now extract
    the tokens from a Xiaomi Cloud account. Thanks, @starkillerOG!
  • More Xiaomi updates, @jbouwh added support for fan percentage-based speeds
    and preset modes.
  • @RenierM26 added a lot of new services to the Ezviz integration, thanks!
  • Tibber had quite a few improvements and now provides a power factor sensor,
    added by @Danielhiversen!
  • Google Translate TTS now supports the Bulgarian language,
    thanks @hristo-atanasov!
  • If you have a SmartTube, you can now reset your reminders, thanks @mdz!
  • KNX had quite a lot of updates and added support for XY-color lights,
    thanks @farmio.
  • @OttoWinter added support for presets, custom presets and custom fan modes
    for climate controls in ESPHome. Awesome!
  • Nuki now has a service to enable/disable continuous mode, thanks @anaisbetts!
  • @cgomesu added quantiles to the Statistics integration, thanks!
  • The Home Assistant login page now better supports password managers,
    thanks, @rianadon!

New Integrations

We welcome the following new integrations this release:

New Platforms

The following integrations got support for a new platform:

Integrations now available to set up from the UI

The following integrations are now available via the Home Assistant UI:

Release 2021.7.1 – July 8

  • Fix service registration typo in Nuki integration (@anaisbetts – #52631) (nuki docs)
  • Fix Fritz default consider home value (@chemelli74 – #52648) (fritz docs)
  • Handle KeyError when accessing device information (@ludeeus – #52650) (ecovacs docs)
  • Warn if interface_addr remains in Sonos configuration (@jjlawren – #52652) (sonos docs)
  • Ignore unused keys from Sonos device properties callback (@jjlawren – #52660) (sonos docs)
  • Ensure Forecast.Solar returns an iso formatted timestamp (@frenck – #52669) (forecast_solar docs)
  • Use iso-formatted times in MetOffice weather forecast (@avee87 – #52672) (metoffice docs)
  • Fix precipitation calculation for hourly forecast (@agners – #52676) (openweathermap docs)
  • Move recorder.py import to runtime (@uvjustin – #52682) (stream docs)
  • Bump simplisafe-python to 11.0.1 (@bachya – #52684) (simplisafe docs)
  • pyWeMo version bump (0.6.5) (@esev – #52701) (wemo docs)
  • Bump pylutron to 0.2.8 fixing python 3.9 incompatibility (@JonGilmore – #52702) (lutron docs)
  • Add check for _client existence in modbus (@janiversen – #52719) (modbus docs)
  • Fix KNX Fan features (@Tommatheussen – #52732) (fan docs)
  • Esphome fix camera image (@jesserockz – #52738) (esphome docs)

Release 2021.7.2 – July 12

  • Ignore Sonos Boost devices during discovery (@jjlawren – #52845) (sonos docs)
  • Add zeroconf discovery to Sonos (@bdraco – #52655) (sonos docs)
  • Remove scale calculation for climacell cloud cover (@apaperclip – #52752) (climacell docs)
  • Fix homebridge devices becoming unavailable frequently (@Jc2k – #52753) (homekit_controller docs)
  • Fix nexia thermostats humidify without dehumidify support (@bdraco – #52758) (nexia docs)
  • Support certain homekit devices that emit invalid JSON (@Jc2k – #52759) (homekit_controller docs)
  • Send ssdp requests to ipv4 broadcast as well (@bdraco – #52760) (ssdp docs)
  • Bump dependency to properly handle current and voltage not being reported on some zhapower endpoints (@Kane610 – #52764) (deconz docs)
  • Upgrade pymazda to 0.2.0 (@bdr99 – #52775)
  • Fix ESPHome Camera not merging image packets (@OttoWinter – #52783) (esphome docs)
  • Fix Neato parameter for token refresh (@chemelli74 – #52785) (neato docs)
  • Add the Trane brand to nexia (@bdraco – #52805) (nexia docs)
  • Bump python-fireservicerota to 0.0.42 (@cyberjunky – #52807) (fireservicerota docs)
  • Bump up ZHA dependencies (@Adminiuga – #52818) (zha docs)
  • Update arcam lib to 0.7.0 (@elupus – #52829) (arcam_fmj docs)
  • Bump aiohomekit to 0.5.1 to solve performance regression (@bdraco – #52878) (homekit_controller docs)
  • Bump pyhaversion to 21.7.0 (@ludeeus – #52880) (version docs)
  • Prefer using xy over hs when supported by light (@Kane610 – #52883) (deconz docs)
  • Bump zwave-js-server-python to 0.27.1 (@raman325 – #52885) (zwave_js docs)
  • Surepetcare, fix set_lock_state (@Danielhiversen – #52912) (surepetcare docs)
  • Bump pyinsteon to 1.0.11 (@teharris1 – #52927) (insteon docs)
  • Fix recorder purge with sqlite3 < 3.32.0 (@bdraco – #52929)
  • Bump pysonos to 0.0.52 (@jjlawren – #52934) (sonos docs)

Release 2021.7.3 – July 16

  • Update ZHA to support zigpy 0.34.0 device initialization (@puddly – #52610) (zha docs)
  • copy() -> deepcopy() (@janiversen – #52794) (modbus docs)
  • Only allow one active call in each platform (@janiversen – #52823) (modbus docs)
  • Bump pyatv to 0.8.1 (@doug-hoffman – #52849) (apple_tv docs)
  • Handle dhcp packets without a hostname (@bdraco – #52882) (dhcp docs)
  • Add OUIs for legacy samsungtv (@bdraco – #52928) (samsungtv docs)
  • Bump python-fireservicerota to 0.0.43 (@cyberjunky – #52966) (fireservicerota docs)
  • More graceful exception handling in Plex library sensors (@jjlawren – #52969) (plex docs)
  • Fix issue connecting to Insteon Hub v2 (@teharris1 – #52970) (insteon docs)
  • Bump pysma to 0.6.4 (@rklomp – #52973) (sma docs)
  • Update pyrainbird to 0.4.3 (@peternijssen – #52990) (rainbird docs)
  • Bump pypck to 0.7.10 (@alengwenus – #53013) (lcn docs)
  • Fix for timestamp not present in SIA (@eavanvalkenburg – #53015) (sia docs)
  • Co2signal, set SCAN_INTERVAL (@Danielhiversen – #53023) (co2signal docs)
  • Another SIA fix for timestamp not present (@eavanvalkenburg – #53045)
  • Fix knx expose feature not correctly falling back to default value (@da-anda – #53046) (knx docs)
  • Expose Spotify as a service (@balloob – #53063)
  • Increase polling interval to prevent reaching daily limit (@vlebourl – #53066) (home_plus_control docs)
  • Add light white parameter to light/services.yaml (@emontnemery – #53075) (light docs)

Release 2021.7.4 – July 21

  • Allow pymodbus to reconnect in running system (not startup) (@janiversen – #53020) (modbus docs)
  • Fix groups reporting incorrect supported color modes (@Kane610 – #53088) (deconz docs)
  • Handle all WeMo ensure_long_press_virtual_device exceptions (@esev – #53094) (wemo docs)
  • Fix remote rpi gpio input type (@jgriff2 – #53108) (remote_rpi_gpio docs)
  • More restrictive state updates of UniFi uptime sensor (@Kane610 – #53111) (unifi docs)
  • Bump simplisafe-python to 11.0.2 (@bachya – #53121) (simplisafe docs)
  • Bump nexia to 0.9.10 to fix asair login (@bdraco – #53122) (nexia docs)
  • Bump surepy to 0.7.0 (@benleb – #53123) (surepetcare docs)
  • Upgrade pysonos to 0.0.53 (@amelchio – #53137) (sonos docs)
  • Correctly detect is not home (@balloob – #53279) (device_tracker docs)
  • Upgrade to async-upnp-client==0.19.1 (@StevenLooman – #53288) (dlna_dmr docs) (ssdp docs) (upnp docs)
  • Fix homekit locks not being created when set up from the UI (@bdraco – #53301) (homekit docs)

If you need help…

…don’t hesitate to use our very active forums or join us for a little chat.

Experiencing issues introduced by this release? Please report them in our issue tracker. Make sure to fill in all fields of the issue template.

Nothing founder Carl Pei on Ear (1) and building a hardware startup from scratch

On July 27, hardware maker Nothing will debut its first product, wireless earbuds dubbed Ear (1). Despite releasing almost no tangible information about the product, the company has managed to generate substantial buzz around the launch — especially for an entry into the already-crowded wireless earbud market.

The hype, however, is real — and somewhat understandable. Nothing founder Carl Pei has a good track record in the industry — he was just 24 when he co-founded OnePlus in 2013. The company has done a canny job capitalizing on heightened expectations, meting out information about the product like pieces in a puzzle.

We spoke to Pei ahead of the upcoming launch to get some insight into Ear (1) and the story behind Nothing.

TC: I know there was a timing delay with the launch. Was that related to COVID-19 and supply chain issues?

CP: Actually, it was due to our design. Maybe you’ve seen the concept image of this transparent design. It turns out there’s a reason why there aren’t many transparent consumer tech products out there. It’s really, really hard to make it high quality. You need to ensure that everything inside looks just as good as the outside. So that’s where the team has been iterating, [but] you probably wouldn’t notice the differences between each iteration.

It could be getting the right magnets — as magnets are usually designed to go inside of a product and not be seen by the consumer — to figuring out the best type of gluing. You never have to solve that problem if you have a non-transparent product, but what kind of glue will keep the industrial design intact? I think the main issue has been getting the design ready. And we’re super, super close. Hopefully, it will be a product that people are really excited about when we launch.

So, there were no major supply chain issues?

Not for this product category. With true wireless earbuds, I think we’re pretty fine. No major issues. I mean, we had the issue that we started from zero — so no team and no partners. But step by step, we finally got here.

That seems to imply that you’re at least thinking ahead toward the other products. Have you already started developing them?

We have a lot of products in the pipeline. Earlier this year, we did a community crowdfunding round where we allocated $1.5 million to our community. That got bought up really quickly. But as part of that funding round, we had a deck with some of the products in development. Our products are code-named as Pokémon, so there are a lot of Pokémon on that slide [Ed note: The Ear (1) was “Aipom.”]. We have multiple categories that we’re looking at, but we haven’t really announced what those are.

Why were earbuds the right first step?

I think this market is really screaming for differentiation. If you look at true wireless today, I think after Apple came out with the AirPods, the entire market kind of followed. Everybody wears different clothes. This is something we wear for a large part of the day. Why wouldn’t people want different designs?

We’re working with Teenage Engineering — they’re super, super strong designers. I think true wireless is a place where we can really leverage that strength. Also, from a more rational business perspective, wireless earbuds is a super-fast-growing product category. I think we’re going to reach 300 million units shipped worldwide this year for this category. And your first product category should be one with good business potential.

“Screaming for differentiation” is an interesting way to put it. When you look at AirPods and the rest of the industry, are aesthetics what the market primarily lacks? Is it features or is it purely stylistic?

If we take a step back and think about it from a consumer perspective, we feel like, as a whole, consumer tech is quite, quite boring. Kids used to want to become engineers and astronauts and all that. But if you look at what kids want to become today, they want to be TikTokers or YouTubers. Maybe it’s because technology isn’t as inspiring as before. We talked to consumers, and they don’t care as much as a couple of years ago either. If you look at what brands are doing in their communication, it’s all about features and specs.

Apple just released the first iOS 15 beta to everyone

This is your opportunity to get a glimpse of the future of iOS, iPadOS and watchOS. Apple just released the first public beta of iOS 15, iPadOS 15 and watchOS 8. Those releases are the next major versions of the operating systems for the iPhone, iPad and Apple Watch. Unlike developer betas, everyone can download these betas — you don’t need a $99 developer account. But don’t forget, it’s a beta.

The company still plans to release the final version of iOS 15, iPadOS 15 and watchOS 8 this fall. But Apple is going to release betas every few weeks over the summer. It’s a good way to fix as many bugs as possible and gather data from a large group of users.

As always, Apple’s public betas closely follow the release cycle of developer betas. Apple also released the second developer beta of iOS and iPadOS 15 today. So it sounds like the first public beta is more or less the same build as the second developer build.

But remember, you shouldn’t install a beta on your primary iPhone or iPad. The issue is not just bugs — some apps and features won’t work at all. In some rare cases, beta software can also brick your device and make it unusable. You may even lose data on iCloud. Proceed with extreme caution.

But if you have an iPad, iPhone or Apple Watch you don’t need, here’s how to download it. Head over to Apple’s beta website from the device you want to use for the beta and download the configuration profile — do that from your iPhone for the watchOS beta. It’s a tiny file that tells your device to update to public betas like it’s a normal software update.

Once it’s installed, reboot your device, then head over to the Settings (or Watch) app. You should see an update. In September, your device should automatically update to the final version of iOS 15, iPadOS 15 or watchOS 8 and you’ll be able to delete the configuration profile.

The biggest change of iOS 15 is a new Focus mode. In addition to “Do not disturb,” you can configure various modes — you can choose apps and people you want notifications from and change your focus depending on what you’re doing. For instance, you can create a Work mode, a Sleep mode, a Workout mode, etc.

There are many new features across the board, such as a new Weather app, updated maps in Apple Maps, an improved version of FaceTime with SharePlay and more. Safari also has a brand-new look.

 


Facebook and Matterport collaborate on realistic virtual training environments for AI

To train a robot to navigate a house, you either need to give it a lot of real time in a lot of real houses, or a lot of virtual time in a lot of virtual houses. The latter is definitely the better option, and Facebook and Matterport are working together to make thousands of virtual, interactive digital twins of real spaces available for researchers and their voracious young AIs.

On Facebook’s side the big advance is in two parts: the new Habitat 2.0 training environment and the dataset they created to enable it. You may remember Habitat from a couple years back; in the pursuit of what it calls “embodied AI,” which is to say AI models that interact with the real world, Facebook assembled a number of passably photorealistic virtual environments for them to navigate.

Many robots and AIs have learned things like movement and object recognition in idealized, unrealistic spaces that resemble games more than reality. A real-world living room is a very different thing from a reconstructed one. By learning to move about in something that looks like reality, an AI’s knowledge will transfer more readily to real-world applications like home robotics.

But ultimately these environments were only polygon-deep, with minimal interaction and no real physical simulation — if a robot bumps into a table, it doesn’t fall over and spill items everywhere. The robot could go to the kitchen, but it couldn’t open the fridge or pull something out of the sink. Habitat 2.0 and the new ReplicaCAD dataset change that with increased interactivity and 3D objects instead of simply interpreted 3D surfaces.

Simulated robots in these new apartment-scale environments can roll around like before, but when they arrive at an object, they can actually do something with it. For instance if a robot’s task is to pick up a fork from the dining room table and go place it in the sink, a couple years ago picking up and putting down the fork would just be assumed, since you couldn’t actually simulate it effectively. In the new Habitat system the fork is physically simulated, as is the table it’s on, the sink it’s going to, and so on. That makes it more computationally intense, but also way more useful.

They’re not the first to get to this stage by a long shot, but the whole field is moving along at a rapid clip and each time a new system comes out it leapfrogs the others in some ways and points at the next big bottleneck or opportunity. In this case Habitat 2.0’s nearest competition is probably AI2’s ManipulaTHOR, which combines room-scale environments with physical object simulation.

Where Habitat has it beat is in speed: according to the paper describing it, the simulator can run roughly 50-100 times faster, which means a robot can get that much more training done per second of computation. (The comparisons aren’t exact by any means and the systems are distinct in other ways.)

The dataset used for it is called ReplicaCAD, and it’s essentially the original room-level scans recreated with custom 3D models. This is a painstaking manual process, Facebook admitted, and they’re looking into ways of scaling it, but it provides a very useful end product.

The original scanned room, above, and ReplicaCAD 3D recreation, below.

More detail and more types of physical simulation are on the roadmap — basic objects, movements, and robotic presences are supported, but fidelity had to give way for speed at this stage.

Matterport is also making some big moves in partnership with Facebook. After making a huge platform expansion over the last couple years, the company has assembled an enormous collection of 3D-scanned buildings. Though it has worked with researchers before, the company decided it was time to make a larger part of its trove available to the community.

“We’ve Matterported every type of physical structure in existence, or close to it. Homes, high-rises, hospitals, office spaces, cruise ships, jets, Taco Bells, McDonalds… and all the info that is contained in a digital twin is very important to research,” CEO RJ Pittman told me. “We thought for sure this would have implications for everything from doing computer vision to robotics to identifying household objects. Facebook didn’t need any convincing… for Habitat and embodied AI it is right down the center of the fairway.”

To that end it created a dataset, HM3D, of a thousand meticulously 3D-captured interiors, from the home scans that real estate browsers may recognize to businesses and public spaces. It’s the largest such collection that has been made widely available.

Image Credits: Matterport

The environments, which are scanned and interpreted by an AI trained on precise digital twins, are dimensionally accurate to the point where, for example, exact numbers for window surface area or total closet volume can be calculated. It’s a helpfully realistic playground for AI models, and while the resulting dataset isn’t interactive (yet) it is very reflective of the real world in all its variance. (It’s distinct from the Facebook interactive dataset but could form the basis for an expansion.)

“It is specifically a diversified dataset,” said Pittman. “We wanted to be sure we had a rich grouping of different real world environments — you need that diversity of data if you want to get the most mileage out of it training an AI or robot.”

All the data was volunteered by the owners of the spaces, so don’t worry that it’s been sucked up unethically by some small print. Ultimately, Pittman explained, the company wants to create a larger, more parameterized dataset that can be accessed by API — realistic virtual spaces as a service, basically.

“Maybe you’re building a hospitality robot, for bed and breakfasts of a certain style in the U.S — wouldn’t it be great to be able to get a thousand of those?” he mused. “We want to see how far we can push advancements with this first dataset, get those learnings, then continue to work with the research community and our own developers and go from there. This is an important launching point for us.”

Both datasets will be open and available for researchers everywhere to use.

Community Highlights: 27th edition

The 27th edition of the Home Assistant Community Highlights! Some interesting
things popped up around our community that we thought were worth sharing.

But first, I have one more announcement. My internship is (almost) over, and
I’m happy to say that I passed and completed it with a good grade. That also
means that this community highlight is the last one from me as an intern;
time will tell when the next one will appear.

Do you want to share something for the next edition?
Information on how to share.

./Klaas
Intern on Home Assistant Energy

Blueprint of the week


This week’s blueprint comes from bfranke1973,
who has created a blueprint that sends you a notification when a device loses
its connection with the network. Try it out! Read more about it
on the community forum, or install this automation in your
instance with a click on the My button!

Love Lock Card


Do you ever accidentally turn on a light or switch? Then try the
love lock card made by
CyrisXD. With it, you can create a locked card
that can be unlocked with, for example, a PIN code or a
simple click on the card.

Lovelace Dashboard


This week, we again have a new Lovelace dashboard for your dose of
inspiration 😄 This time it is from swake88,
who is improving his dashboard
for use on a mobile device. If you want to know more about it, check out
the comments on Reddit.

Would you also like your dashboard to be featured in the community highlights?
Drop it on Reddit and maybe I’ll pick it for the next edition.

Statistic Tools


Home Assistant has a number of tools for working with statistics, but the
question is how best to approach them. Carlos
has written a very good guide on
how to get started and how to make cards that use this data.

Got a tip for the next edition?


Have you seen (or made) something awesome, interesting, unique, amazing,
inspirational, unusual or funny, using Home Assistant?

Click here to send us your Community Highlight suggestion.

Also, don’t forget to share your creations with us via Social Media:

See you next edition!


Nixie’s drone-based water sampling could save cities time and money

Regularly testing waterways and reservoirs is a never-ending responsibility for utility companies and municipal safety authorities, and generally — as you might expect — involves either a boat or at least a pair of waders. Nixie does the job with a drone instead, making the process faster, cheaper and a lot less wet.

The most common methods of testing water quality haven’t changed in a long time, partly because they’re effective and straightforward, and partly because really, what else are you going to do? No software or web platform out there is going to reach into the middle of the river and pull out a liter of water.

But with the advent of drones powerful and reliable enough to deploy in professional and industrial circumstances, the situation has changed. Nixie is a solution by the drone specialists at Reign Maker, involving either a custom-built sample collection arm or an in-situ sensor arm.

The sample collector is basically a long vertical arm with a locking cage for a sample container. You put the empty container in there, fly the drone out to the location, then submerge the arm. When it flies back, the filled container can be taken out while the drone hovers and a fresh one put in its place to bring to the next spot. (This switch can be done safely in winds up to 18 MPH and sampling in currents up to 5 knots, the company said.)

Image Credits: Reign Maker

This allows for quick sampling at multiple locations — the drone’s battery will last about 20 minutes, enough for two to four samples depending on the weather and distance. Swap the battery out and drive to the next location and do it all again.

For comparison, Reign Maker pointed to New York’s water authority, which collects 30 samples per day from boats and other methods, at an approximate cost (including labor, boat fuel, etc) of $100 per sample. Workers using Nixie were able to collect an average of 120 samples per day, for around $10 each. Sure, New York is probably among the higher cost locales for this (like everything else) but the deltas are pretty huge. (The dipper attachment itself costs $850, but doesn’t come with a drone.)

It should be mentioned that the drone is not operating autonomously; it has a pilot who flies within line of sight (which simplifies regulations and requirements). But even so, that means a team of two, with a handful of spare batteries, can cover the same area that would normally take a boat crew and more than a little fuel. Currently the system works with the M600 and M300 RTK drones from DJI.

Image Credits: Reign Maker

The drone method has the added benefits of having precise GPS locations for each sample and of not disturbing the water when it dips in. No matter how carefully you step or pilot a boat, you’re going to be pushing the water all over the place, potentially affecting the contents of the sample, but that’s not the case if you’re hovering overhead.

In development is a smarter version of the sampler that includes a set of sensors that can do on-site testing for all the most common factors: temperature, pH, troubling organisms, various chemicals. Skipping the step of bringing the water back to a lab for testing streamlines the process immensely, as you might expect.

Right now Reign Maker is working with New York’s Department of Environmental Protection and in talks with other agencies. While the system would take some initial investment, training, and getting used to, it’s probably hard not to be tempted by the possibility of faster and cheaper testing.

Ultimately the company hopes to offer (in keeping with the zeitgeist) a more traditional SaaS offering involving water quality maps updating in real time with new testing. That too is still in the drawing-board phase, but once a few customers sign up it starts looking a lot more attractive.

Community Highlights: 26th edition

The 26th edition of the Home Assistant Community Highlights! Some interesting
things popped up around our community that we thought were worth sharing.

Do you want to share something for the next edition?
Information on how to share.

./Klaas
Intern on Home Assistant Energy

Power-up your ESPHome Projects


Speaking of interesting things, have you seen the cool new stuff that has come to
ESPHome? Read more about it in this blog.

Blueprint of the week


This week’s blueprint comes from danielbook,
who created a blueprint that turns on the lights in a room based on a motion
and brightness sensor. You will no longer be left in the dark when you get
home 😉 Try it out! Read more about it on the community forum
or install this automation in your instance with a click on the My button!

Floor3D Card


Last time we shared a dashboard with a floor plan; this time it’s the
floor3D card from adizanni,
which you can use to get started with your own floor plan.

Lovelace Dashboard


This week we again have a new Lovelace dashboard for your necessary portion
of inspiration 😄 This time it’s one from suckfail,
featuring a variety of cards. If you want to know more about it, check out the
comments on Reddit.

Would you also like your dashboard to be featured in the Community Highlights?
Drop it on Reddit and maybe I’ll pick it for the next edition.

Chore Tracker


Are you tired of doing those household chores all the time, or do the kids
just refuse to unload the dishwasher? Make it more fun with the
chores tracker made by djbrooks022,
where you can earn points for every chore you complete!

Do you want to get started? Then find all the information here.

Got a tip for the next edition?


Have you seen (or made) something awesome, interesting, unique, amazing,
inspirational, unusual or funny, using Home Assistant?

Click here to send us your Community Highlight suggestion.

Also, don’t forget to share your creations with us via Social Media:

See you next edition!


Power-up your ESP8266 and ESP32 projects: browser-based installation and configure Wi-Fi via Bluetooth LE

ESP8266 and ESP32 are microcontrollers made by the Chinese company Espressif.
Microcontrollers are teeny tiny computers with little processor power,
memory and space that can interact with sensors, send infrared commands
and many other things.

With the ESP devices Espressif has achieved something formidable: their devices
have Wi-Fi, are compatible with code for the popular Arduino microcontroller
and they are cheap. Like, $5-including-shipping-from-China-cheap
(AliExpress) or $15 for 2 on Amazon cheap.
So cheap that they are the de facto standard for microcontrollers used in
IoT products, both for manufacturers and creators in the DIY space.

Quindor and DrZzs playing with an ESP32-based QuinLED running WLED
(YouTube)

Microcontrollers are just computers and so are nothing without their software.
Open source software like ESPHome, WLED and
Tasmota allow users to turn their ESP8266 and ESP32 devices into
powerful little machines that can gather information and control devices.
In your home, microcontrollers are the eyes and ears while Home Assistant
is the brain.

But these projects all have a common problem: it is difficult to get started.
We identified three pain points:

  1. Installing the software on the microcontroller.
  2. Connecting the microcontroller to your wireless network.
  3. Configuring the software on the microcontroller.

These pain points stand in the way of creators reaching a wider audience. It’s
our mission to make local home automation succeed, and these projects,
and all the possibilities they unlock, are an important part of that.

Today, we are introducing some things to make using microcontrollers easier.

Using terms everybody understands

We are going to start using words that users understand instead of forcing
technical terms on them. Terms like “firmware” and “flashing” are the
correct terminology, but for inexperienced users they do more harm than good.
They make users feel uncomfortable before they even start.

So instead of “upload firmware” we’ve updated the ESPHome dashboard to talk
about “installing”. We are encouraging other projects to do the same.

Things will get more technical as a user continues playing with microcontrollers.
But this change might just be the little thing that makes them actually continue.

ESP Web Tools: Installing projects on your microcontroller via the browser

We have created ESP Web Tools. ESP Web Tools allows project
websites to offer a great onboarding by enabling users to install the software
on their microcontrollers via their browser. All the user has to do is connect
their microcontroller to their computer and hit the install button on the
website. ESP Web Tools will automatically select the right build for your
microcontroller and install it.

This works for both the ESP8266 and ESP32 and with any project for these
devices. This technology is powered by Web Serial, a web standard for serial
communication that is part of Google Chrome and Microsoft Edge.

ESP Web Tools has already been adopted as part of the onboarding by
WLED and ESPEasy.

Learn how to add ESP Web Tools to your website

If you have an ESP32 or ESP8266 device handy, you can try it out right here:

ESP Web Tools uses code written by
@MakerMelissa from Adafruit. We’re currently relying
on an enhanced fork that can fit a wider range of use cases, including ours.
We have a pull request open to get our changes contributed back.

ESP Web Tools website

Note: We don’t like to use technology that is not available in all browsers
and cannot be made available in other ways. However, in this case the benefits
outweigh the cons. We hope that Firefox and WebKit add support for Web Serial
in the future.

Improv Wi-Fi: Open standard to provision Wi-Fi credentials via Bluetooth Low Energy

We have created Improv Wi-Fi. Improv Wi-Fi is a free and open standard
that anyone can use to offer users a friendly way to connect their
devices to the wireless network.

Improv Wi-Fi logo

For open source firmware there are two popular ways of getting a device to
connect to your wireless network: the device sets up its own wireless network
that you join via your phone or laptop, or the user compiles the network name
and password into the firmware before installing it on the ESP.
Both methods are difficult and error prone, and they offer a bad user experience.

If you look at off-the-shelf products, you see another approach:
send Wi-Fi credentials to the device via Bluetooth Low Energy (BLE). BLE allows
the user to get instant feedback if something goes wrong. This technology is
used in many products, but until now there was no open standard that is free
to implement. Improv Wi-Fi is exactly that.

Open source projects often host their control interface as a website on the
ESP device. Improv Wi-Fi supports this and when provisioning is done, the user
can be redirected to a URL to finish onboarding.

Improv Wi-Fi can be used today to provision ESP32 microcontrollers running
ESPHome (ESP8266 devices do not support BLE). Users will soon be
able to provision devices with the Improv Wi-Fi service via the Home Assistant
Android and iOS apps. All these implementations are open source and can be used
in your projects.
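For ESPHome users, turning the Improv Wi-Fi service on is a matter of configuration rather than code. Below is a minimal sketch based on the `esp32_improv` component described in the ESPHome docs; the `boot_button` ID and GPIO0 pin are illustrative, and exact keys may differ by ESPHome version. With this in place, Wi-Fi credentials arrive over BLE instead of being compiled into the firmware.

```yaml
# Sketch (ESP32 only): advertise the Improv Wi-Fi provisioning service over BLE.
esp32_improv:
  # Provisioning is only allowed while this binary_sensor reads "on",
  # i.e. while the user holds the button — a simple physical authorization.
  authorizer: boot_button

binary_sensor:
  - platform: gpio
    id: boot_button
    pin:
      number: GPIO0        # illustrative: the boot button on many dev boards
      inverted: true
      mode: INPUT_PULLUP
```

Requiring a button press means a neighbor within BLE range cannot silently re-provision your device.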

Improv Wi-Fi is also available for the web in the form of a button that can be
added to your website. This will allow users to configure and set up a device
from any browser that supports Web Bluetooth.

If you’ve used the installation button in the previous section but have not yet
connected it to the wireless network, you can onboard that device here:

Improv Wi-Fi website

ESPHome Dashboard: simplified and streamlined

With ESPHome, users don’t program microcontrollers; they configure
them. Tell ESPHome there is a temperature sensor on pin D2 of your ESP device
and ESPHome will install custom software on your ESP device that makes this
information available in Home Assistant.

# Example ESPHome configuration
sensor:
  - platform: dht
    pin: D2
    temperature:
      name: "Living Room Temperature"
    humidity:
      name: "Living Room Humidity"
    update_interval: 60s

Result of how it shows up in Home Assistant with the example ESPHome above

The ESPHome Dashboard has been updated with a simplified and streamlined wizard
for new configurations. You now enter the name of your project and your Wi-Fi
credentials, and the dashboard installs ESPHome on your device via the browser.
After that, all further updates happen wirelessly.

ESPHome: embracing projects

We want to make it easy for creators to sell ESPHome powered products that offer
a great user experience. ESPHome projects embrace local control and integrate
nicely with Home Assistant, and so each extra ESPHome product that our users
can buy is a win.

To make it easier to keep creators and users connected once a product is
installed, projects can now add a project identifier and version to their
firmware (docs). With
today’s release this information will be available in the device information,
logging output and the mDNS discovery info.
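For creators, adding the identifier is a small piece of YAML. A minimal sketch, following the `project` block in the linked docs — the device name and the `acme.smart_lamp` "vendor.product" identifier below are placeholders:

```yaml
# Sketch: tag the firmware with a project identifier and version so the
# dashboard, logs and mDNS discovery can recognize the product.
esphome:
  name: my-device           # placeholder device name
  project:
    name: acme.smart_lamp   # placeholder "vendor.product" identifier
    version: "1.0.0"
```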

The goal is to integrate projects more tightly into the ESPHome dashboard by
showing the project’s logo, linking to the documentation and issue pages, and
allowing updates to be installed.

Why we built this

Home Assistant’s mission is to make local home automation a viable alternative
to cloud based solutions and accessible to everyone.

To make this mission a reality, we started the company Nabu Casa. Together with
the community, Nabu Casa develops Home Assistant and ESPHome and is funded
solely by people that support this mission. No investors or loans.

If you want to help fund our work, subscribe to Home Assistant Cloud.


Nintendo teases 2022 release for Breath of the Wild sequel and releases Zelda Game & Watch to tide us over

Nintendo defied expectations today with an E3-timed Direct showing off not the hoped-for new Switch hardware but a dozen or so new games — as well as a general release window for the much-anticipated next Zelda game. And to celebrate the original’s 35th anniversary, it will sell a new Game & Watch featuring the first three games in the series.

Among other things, Nintendo showed off remasters or remakes of titles from the “Monkey Ball,” “Mario Party,” “Advance Wars,” “Wario Ware” and other series, and announced new entries in the “Mario + Rabbids” and “Shin Megami Tensei” worlds. Other newly announced or teased games will be making it to Switch as well, like the new “Guardians of the Galaxy.”

Perhaps most surprising was the inclusion of a new side-scrolling Metroid game, the first in nearly 20 years — and in fact, it has been in and out of development for half that time. “Metroid Dread,” the fifth in the mainline series that began on the NES, will release October 8, and we’ll see if Nintendo has managed to keep pace in a genre it pioneered but others have refined.

Image Credits: Nintendo

Everyone was hoping for Zelda news, however, and Nintendo… only slightly disappointed us. As the announcers noted, it’s the 35th anniversary of the NES original, and the perfect time to announce something truly special, but they have “no campaigns or other Nintendo Switch games planned.”

Instead, they offered an admittedly tempting Game & Watch in the style of the one we saw released last year for the Mario series. I had lots of good things to say about that device, and the new one will no doubt be just as fun. The ability to pause the game and pick it up later (but not rewind or save states) should make for a fun, authentic playthrough of the first three games in the Zelda series: “The Legend of Zelda” and “Zelda II: The Adventure of Link” for NES, and “Link’s Awakening” for Game Boy (recently remade).

Image Credits: Nintendo

The last item on the list was a new look at the follow-up to Breath of the Wild, which years after its debut still shines as one of the, if not the, best game on the Switch. Its sequel has a lot to live up to!

While the first trailer was all cinematic, this one showed gameplay and the overworld, including a new level of verticality that brings flying fortresses and castles in the air into play. It certainly looks impressive, but one wonders how much further the company can push its Switch hardware. After all, “Breath of the Wild” pushed the system to its limits at its debut, and even then it was not as powerful as its rivals from Microsoft and Sony — both now replaced by a new generation.

One hopes that Nintendo is simply being weird and has a trick up its sleeve, as it has many times before. The Switch was announced out of nowhere, and previous hardware updates have also dropped with little or no warning and seemingly arbitrary timing. What’s expected is an updated Switch that’s physically the same dimensions but considerably updated inside and using a larger, better display. Perfect backwards compatibility, like with the 3DS series of handhelds, also seems only logical. But Nintendo has always done its own thing and its fans wouldn’t have it any other way.

Kai-Fu Lee’s Sinovation bets on Linux tablet maker Jingling in $10M round

Kai-Fu Lee’s Sinovation Ventures has its eyes on a niche market targeting software developers. In April, the venture capital fund led a $10 million angel round in Jingling, a Chinese startup developing Linux-based tablets and laptops, TechCrunch has learned. Other investors in the round included private equity firm Trustbridge Partners.

Jingling was founded only in June 2020 but has quickly assembled a team of 80 employees hailing from the likes of Aliyun OS, Alibaba’s Linux distribution, Thunder Software, a Chinese operating system solution provider, and China’s open source community.

The majority of the startup’s staff are working on its Linux-based operating system called JingOS in Beijing, with the rest developing hardware in Shenzhen, where its supply chain is located.

“Operating systems are a highly worthwhile field for investment,” Peter Fang, a partner at Sinovation Ventures, told TechCrunch. “We’ve seen the best product iteration for work and entertainment through the combination of iPad Pro and Magic Keyboard, but no tablet maker has delivered a superior user experience for the Android system so far, so we decided to back JingOS.”

“The investment is also in line with Sinovation’s recognition and prediction in ARM powering more mobile and desktop devices in the future,” the investor added.

Jingling’s first device, the JingPad A1 tablet based on the ARM architecture, has already shipped over 500 units in a pre-sale and is ramping up interest through a crowdfunding campaign. Jingling currently uses processors from Tsinghua Unigroup but is looking into Qualcomm and MediaTek chipsets for future production, according to Liu.

On the software end, JingOS, which is open sourced on GitHub, has accumulated over 50,000 downloads from users around the world, most of whom are in the United States and Europe.

But how many people want a Linux tablet or laptop? Liu Chengcheng, who launched Jingling with Zhu Rui, said the demand is big enough from the developer community to sustain the startup’s early-phase growth. Liu is known for founding China’s leading startup news site 36Kr and Zhu is an operating system expert and a veteran of Motorola and Lenovo.

Targeting the Linux community is step one for Jingling, for “it’s difficult to gain a foothold by starting out in the [general] consumer market,” said Liu.

“The Linux market is too small for tech giants but too hard for small startups to tackle… aside from Jingling, Huawei is the only other company in China building a mobile operating system, but HarmonyOS focuses more on IoTs.”

Launching a new operating system is surely an audacious move, though it has been attempted before. Linux laptops have been around for years, but Jingling wanted to offer something different by enabling both desktop and mobile experiences on one device. That’s why Jingling made JingOS compatible both with Linux desktop software like WPS Office and Terminal and with the usual Android apps found on smartphones. The JingPad A1 tablet comes with a detachable keyboard that immediately turns it into a laptop, a setup similar to Apple’s Magic Keyboard for iPad.

“It’s like a gift to programmers, who can use it to code in the Linux system but also use Android mobile apps on the run,” said Liu.

Jingling aspires to widen its user base and seize the Chromebook market about two years from now, Liu said. The success of Chromebooks, which comprised 10.8% of the PC market in 2020 and have been increasingly eating into Microsoft’s dominance, is indicative of the slowing demand for Windows personal computers, the founder observed.

The JingPad A1 sells at a starting price of $549, compared to Chromebooks’ wide price range of roughly $200 to $550, depending on the specs and hardware providers. Tablets, along with PCs, got a bump in sales during the pandemic, thanks to more people working and learning remotely, but in the long term, Jingling will have to get its pricing right and pin down where it sits in the market.