Edge AI inference compute to piggyback on US telecom infra
Big, centralized data center projects in the US are running into delays. New construction methods, tighter standards, shortages of skilled labor and materials, and slow hookups to power and water are all pushing timelines out.
To work around this, some AI workloads are shifting closer to end users, riding on existing US telecom infrastructure instead of waiting for new hyperscale capacity. By placing edge AI inference compute within or alongside telecom networks, providers can reuse already-built sites, power, and connectivity to serve latency-sensitive applications even as large data center builds stall.
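To make the latency argument concrete, here is a minimal sketch of how a client might choose between a distant hyperscale region and nearby telecom-hosted edge nodes. Everything in it is hypothetical: the endpoint names, the latency figures, and the probe (which is simulated here rather than timing a real request).

import random

# Hypothetical deployment: one central hyperscale region plus edge nodes
# co-located at telecom central offices. Latencies (ms) are illustrative.
ENDPOINTS = {
    "central-us-east": 45.0,   # distant hyperscale region
    "edge-dallas-co1": 8.0,    # telecom site near the user
    "edge-dallas-co2": 11.0,
}

def probe_latency(endpoint: str) -> float:
    """Simulate a latency probe; a real client would time a small request."""
    return ENDPOINTS[endpoint] + random.uniform(0.0, 3.0)  # add jitter

def pick_inference_endpoint(budget_ms: float = 20.0) -> str:
    """Route to the lowest-latency endpoint; fall back to central if none fit."""
    samples = {ep: probe_latency(ep) for ep in ENDPOINTS}
    best, rtt = min(samples.items(), key=lambda kv: kv[1])
    return best if rtt <= budget_ms else "central-us-east"

if __name__ == "__main__":
    print("routing inference to:", pick_inference_endpoint())

Under these assumed numbers, latency-sensitive requests land on a telecom-hosted edge node, while anything that can tolerate a longer round trip still goes to the central region.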