Edge AI shifts more processing onto devices across IoT systems
More AI work in IoT is moving from the cloud onto devices themselves. Chipmakers are building processors that let cameras, sensors, and other embedded hardware run models locally, cutting latency and reducing constant data transfers.
Demonstrations at Embedded World 2026 showed edge-focused hardware aimed at running inference on the device, rather than streaming raw data back for processing. For companies running connected systems, this model can speed up responses, lower bandwidth use, and change how they architect and manage IoT deployments.
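The bandwidth claim can be illustrated with a back-of-envelope comparison of streaming raw video to the cloud versus sending only on-device inference results. The resolution, frame rate, and payload size below are illustrative assumptions, not figures from the show:

```python
# Back-of-envelope sketch (illustrative assumptions, not vendor figures):
# streaming raw 1080p RGB frames to the cloud vs. sending only the
# on-device inference output (e.g. a few detection boxes per frame).

WIDTH, HEIGHT = 1920, 1080      # assumed camera resolution
BYTES_PER_PIXEL = 3             # assumed uncompressed RGB24 frames
FPS = 30                        # assumed frame rate
RESULT_BYTES_PER_FRAME = 200    # assumed size of one detection payload

raw_bytes_per_sec = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS
edge_bytes_per_sec = RESULT_BYTES_PER_FRAME * FPS

print(f"raw stream:   {raw_bytes_per_sec / 1e6:.1f} MB/s")   # 186.6 MB/s
print(f"edge results: {edge_bytes_per_sec / 1e3:.1f} kB/s")  # 6.0 kB/s
print(f"reduction:    ~{raw_bytes_per_sec // edge_bytes_per_sec}x")
```

Even with video compression on the uplink, the gap stays large, which is why shipping results instead of raw data changes how these deployments are architected.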