
Alex Cattle
on 6 February 2020


Deploying AI/ML solutions in latency-sensitive use cases requires a new solution architecture approach for many businesses.

Fast computational units (e.g. GPUs) and low-latency connections (e.g. 5G) allow AI/ML models to be executed outside the sensors and actuators themselves (e.g. cameras and robotic arms). This reduces costs through lower hardware complexity and by sharing compute resources across the IoT fleet.

Strict AI responsiveness requirements that previously demanded embedding models directly in IoT devices can now be met with GPUs co-located with the sensors and actuators (e.g. in the same factory building). One example is the robot ‘dummification’ trend currently observed in factory robotics, which aims to reduce robot unit costs and simplify fleet management.

In this webinar we explore real-life scenarios in which GPUs and low-latency connectivity unlock solutions that were previously prohibitively expensive, enabling businesses to put them in place and lead the fourth industrial revolution.

Watch the webinar
