Google wants more developers to create smart home experiences, so it’s launching two new tools to make it easier

One day after a Google I/O 2024 keynote that was squarely focused on Gemini and AI, the technology giant is now focusing on the home… specifically, the smart home.

Back in 2023, Google redesigned the Home app for Android and iOS, adding support for the Matter smart home standard and more robust controls. In 2024, Google is expanding its developer tools with two new APIs that make it easier to build the smart home into other apps and experiences.

The new Devices and Structures API and the Automation API join the Commissioning API, which is used to get smart-home gadgets like bulbs, plugs, and countless others online. 

The hope is that developers of all kinds can now build apps that tap into the smart home ecosystems of the homeowners using their products and services. Of course, users have to grant access first, but the idea is to make it easy to develop automations and routines that involve smart home gadgets.

For instance, as a food delivery driver approaches, a routine could automatically turn on your front porch lights, which is pretty neat. 

TechRadar chatted with Anish Kattukaran, Head of Product at Google Home & Nest, to learn more about the latest Home APIs.

Two new APIs can deliver intuitive smart home experiences across apps


(Image credit: Shutterstock)

Essentially, Google Home is now being positioned, and built, as a platform that developers can easily build for and on. With these new tools, Google aims to take some of the frustration out of building for the smart home: a company making a smart home product can adopt a standard easily, while other developers can build unique “innovative experiences” that tie into the smart home.

In Kattukaran’s words, these ‘experiences’, or routines, can “bridge the digital and the physical worlds.” For example, a Pixel phone could tap into smart lights to help you wind down better when sleep mode is engaged, or a smart home brand like Eve can bring unique automation to Google Home for the first time.

The Devices and Structures API gives developers access to over 600 million connected devices through a single integration. This could be how a food delivery app is connected to turn on the outdoor porch lights; or, when a home appliance runs out of cleaning solution, it could instantly reorder more. It has the potential to solve real pain points, and from a privacy and security perspective, the owner – aka you – has to grant permission and can revoke it at any time.
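Since the API itself is still waitlist-only, here's a rough conceptual sketch in Python of what that grant-and-revoke permission model amounts to. Every name in it (`SmartHome`, `grant_access`, `revoke_access`, `set_state`) is an illustrative invention, not part of Google's actual Devices and Structures API.

```python
from dataclasses import dataclass, field

@dataclass
class SmartHome:
    devices: dict                              # device name -> state
    granted_apps: set = field(default_factory=set)

    def grant_access(self, app: str):
        """The homeowner opts a third-party app into their home."""
        self.granted_apps.add(app)

    def revoke_access(self, app: str):
        """...and can pull that access back at any time."""
        self.granted_apps.discard(app)

    def set_state(self, app: str, device: str, state: str):
        # A third-party app can only control devices while permission holds.
        if app not in self.granted_apps:
            raise PermissionError(f"{app} has no access to this home")
        self.devices[device] = state

home = SmartHome(devices={"porch_light": "off"})
home.grant_access("food-delivery-app")
home.set_state("food-delivery-app", "porch_light", "on")  # allowed
home.revoke_access("food-delivery-app")
# Any further set_state call from "food-delivery-app" now raises PermissionError.
```

The point of the shape is the gatekeeping: the developer writes against one integration surface, and the homeowner's consent is the switch that makes it work.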

Google’s new Automation API is intelligent, using generative AI to help connect the dots between smart home devices. For instance, Yale taps it to have the porch lights turn on when a door is locked – the API paves the way for that integration and experience to be delivered.
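That Yale example follows the classic trigger-then-action shape of a routine. Here's a minimal Python sketch of that pattern; the function and device names are purely illustrative, and the real Automation API's types and syntax aren't public in this form.

```python
def make_automation(starter, action):
    """Return a callback that runs `action` when an event matches `starter`."""
    def on_event(event):
        if event == starter:
            return action()
        return None
    return on_event

device_states = {"porch_light": "off"}

def turn_on_porch_light():
    device_states["porch_light"] = "on"
    return "porch_light on"

# Starter: the front door reports it has been locked (as in the Yale example).
porch_light_rule = make_automation(
    starter=("front_door", "locked"),
    action=turn_on_porch_light,
)

porch_light_rule(("front_door", "unlocked"))  # no match, nothing happens
porch_light_rule(("front_door", "locked"))    # fires the action
```

The API's value-add, per Google, is stitching these starters and actions together across brands, so a lock from one maker can drive a light from another.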

The hope is that Google is “untethering the home in a way that any mobile developer can take and integrate with a home in a way that’s easier than was ever possible,” noted Kattukaran. This opens the door so that even developers who don’t design for the smart home can add elements to it, unlocking all sorts of possibilities.

Google is opening the waitlist for the new home APIs today, May 15, and they’ll get a full launch later this year. However, Google has already shared four of the experiences that will be available to smart home users. Eve Systems will bring its automations to Android for the first time, such as automatically lowering blinds based on a condition. The Pixel team aims to make bedtime mode a bit more tangible by also locking doors, lowering the lights, and even shutting off screens.

ADT, which is likely more focused on security than the smart home, will introduce “Trusted Neighbors,” letting a homeowner with a system easily grant secure, temporary access to someone like a family member, friend, or worker. In our conversation, we spoke about possibilities similar to this ADT feature.

For instance, a hotel or property rental could integrate room locking and unlocking and smart home controls into its own application. It could prepare the space, turn on lights, and adjust the climate based on geofencing or an inputted arrival time.

Lastly, Google wants to speed up the smart home and is opening the door for more devices to effectively become home hubs by running a new piece of software: Google Home Runtime. Essentially, many Nest, Android TV, and Google TV devices, as well as many LG TVs, will become Matter-enabled, Thread-supported hubs, allowing them to help process smart home commands and requests locally. In other words, these requests don't need to be sent to the cloud, which can speed them up at home and on Wi-Fi. Kattukaran shared that with this software running on a hub in the home, a command like turning on a smart bulb is up to three times faster.
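The speed-up comes from path selection: a command that can be resolved on the local network skips the round trip to the cloud. Here's a tiny Python sketch of that routing idea; the latency numbers are made-up placeholders for illustration, not Google's measurements.

```python
# Illustrative latencies only, not measured figures.
LOCAL_HOP_MS = 10          # phone -> hub -> device over Wi-Fi/Thread
CLOUD_ROUND_TRIP_MS = 120  # phone -> cloud -> home -> device

def route_command(command: str, local_hub_available: bool):
    """Pick a path for a smart home command and report its rough latency."""
    if local_hub_available:
        return ("local", LOCAL_HOP_MS)   # a Home Runtime hub handles it on the LAN
    return ("cloud", CLOUD_ROUND_TRIP_MS)

# With a hub running Google Home Runtime on the network, the command stays local.
path, latency = route_command("turn on bulb", local_hub_available=True)
```

The same command without a local hub falls back to the cloud path, which is where the "up to three times faster" comparison comes from.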

The outlook on Google Home as a platform

Nanoleaf Essentials smart bulb next to iPhone XR and Google Nest Hub 2

(Image credit: Future)

While this isn’t a new piece of hardware or another redesign – though the app doesn’t need that – it does open up the smart home in an exciting way. Developers of all sorts will be able to integrate and create new, intuitive experiences that can solve everyday problems or make it easier to get your food delivered late at night. It could be something for sports fans, like flashing your lights a certain color when your team scores or integrating with a music streaming service. 

Kattukaran noted his excitement for what developers might build, saying, “What are the kind of things that they are going to build [since] they can now think about the home in a way they never thought about before.”
