Automotive & motorsport

As high-quality user experience (UX) and connected experiences become more important - particularly to younger buyers - safety has shifted from a premium differentiator to a baseline expectation. McKinsey reported that 42 per cent of buyers are willing to switch cars to gain superior Advanced Driver Assistance System (ADAS) capabilities, such as lane-departure warning, automatic emergency braking and parking assistance.
ELECTRICAL EXPERIENCES
Meeting these expectations means that vehicles are becoming as much electronic experiences as mechanical ones.
ADAS features rely on a network of radar, LiDAR, ultrasonic and optical sensors distributed throughout the vehicle. These sensors capture precise, real-time data on distance, position, velocity and surrounding objects, and must do so in conditions such as fog, low light or dense traffic. Consumer demand is also reshaping the in-car digital environment. High-resolution touchscreens, haptic controls, voice assistants and personalised driver profiles are now standard expectations, creating interfaces that mirror the speed and simplicity of personal devices while keeping drivers informed and engaged.
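To illustrate the kind of raw measurement these sensors produce, a ranging sensor such as an ultrasonic or radar unit typically estimates distance from the round-trip time of flight of a pulse. A minimal sketch, with all values and names illustrative rather than taken from any particular sensor:

```python
def distance_from_echo(echo_time_s: float, wave_speed_m_s: float) -> float:
    """Estimate one-way distance from a round-trip echo time.

    The pulse travels out to the object and back, so the one-way
    distance is half of (propagation speed x elapsed time).
    """
    return wave_speed_m_s * echo_time_s / 2.0


# Ultrasonic example: sound travels at roughly 343 m/s in air;
# an echo arriving after 5.8 ms implies an object about 1 m away.
d = distance_from_echo(0.0058, 343.0)
```

Radar and LiDAR apply the same principle with electromagnetic pulses, where the far higher propagation speed is what pushes the downstream electronics into the high-speed regime described below.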
However, delivering seamless experiences depends on the electrical systems that stabilise, convert and process every signal those components generate.
RAW SIGNALS TO REAL-TIME DECISIONS

Beneath the visible sensors and user interfaces lies the unseen electrical architecture that enables vehicles to act intelligently and responsively. This consists of signal conditioning circuits and the processing layer, which together transform raw analogue and digital signals from sensors and user inputs into actionable information for the vehicle.

At the signal conditioning layer, analogue signals from various sensors are stabilised, amplified and filtered to remove noise and interference. For example, camera modules rely on analogue front-end (AFE) circuits to adjust gain, reduce distortion and ensure accurate, reliable images. Radar and LiDAR units pass their reflected waveforms through high-speed analogue-to-digital converters (ADCs) and precision filters, translating raw echoes into digital point clouds and Doppler data ready for interpretation. Similarly, signals from infotainment interfaces - touchscreens, haptic sensors and microphones - are cleaned and normalised, ensuring responsive control and accurate feedback.
These conditioning stages ensure that user inputs and sensor readings remain electrically stable and accurately represented, even under vibration, temperature fluctuations or strong electromagnetic fields.

14 February 2026 Instrumentation Monthly

DRIVING EXPERIENCES, NOT JUST CARS

When Henry Ford wrote, “Any customer can have a car painted any colour that he wants so long as it is black,” he captured an era of uniformity in automotive design. In 2025, consumers do not just want to choose the colour of their car; they want vehicles that deliver human-centric experiences. Here, Ross Turnbull, director of Business Development at application-specific integrated circuit (ASIC) specialist Swindon Silicon Systems, explores how custom integrated circuits support this shift.

Once conditioned, these signals enter the processing layer, where they are fused, analysed and converted into actions. In ADAS systems, multiple sensor streams are combined to detect obstacles, assess distances and trigger safety interventions such as automatic emergency braking. Infotainment processors synchronise touch inputs, graphics rendering, haptic feedback and audio streams, delivering smooth, intuitive interaction without perceptible lag. The processing layer ensures that both sensors and user inputs translate into predictable, safe and seamless experiences. But as the complexity of input data continues to grow, ensuring fast, deterministic processing within constrained power and thermal budgets becomes challenging.
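The fusion-and-decision step in an ADAS pipeline can be sketched very simply: combine independent distance estimates from two sensors, then trigger a braking intervention when the time to collision drops below a threshold. The weights and threshold below are placeholders for illustration, not values from any production system:

```python
def fuse_distance(radar_m, camera_m, w_radar=0.7, w_camera=0.3):
    """Weighted fusion of two independent distance estimates.

    Real systems weight by each sensor's estimated uncertainty
    (e.g. via a Kalman filter); fixed weights keep the sketch simple.
    """
    return w_radar * radar_m + w_camera * camera_m


def should_brake(distance_m, closing_speed_m_s, ttc_threshold_s=1.5):
    """Trigger automatic emergency braking when the time to
    collision (distance / closing speed) falls below a threshold."""
    if closing_speed_m_s <= 0:
        return False  # not closing on the obstacle
    return distance_m / closing_speed_m_s < ttc_threshold_s
```

The safety-critical property is not the arithmetic, which is trivial, but the guarantee that this decision completes within a bounded time on every cycle, which is precisely where general-purpose processors start to struggle.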
PURPOSE-BUILT PROCESSING

Within the processing layer, semiconductor processors such as Central Processing Units (CPUs) and Graphics Processing Units (GPUs) perform general computation, coordinating data from sensors and user inputs. CPUs handle sequential processing tasks, running operating systems and coordinating control flows, while GPUs accelerate parallel workloads such as image or point-cloud processing. However, their general-purpose architecture contains extra logic and memory resources that automotive workloads do not require. This can lead to increased power draw, higher heat generation, unpredictable latency and larger chip area: factors that can compromise real-time, deterministic performance when processing multiple high-bandwidth sensors and user inputs concurrently.

ASICs overcome these constraints. Unlike general-purpose CPUs and GPUs, ASICs are specifically designed to execute a narrowly defined set of operations and can be optimised for deterministic, low-latency performance. Their pipelines are precisely configured for targeted operations, with dedicated accelerators for tasks such as high-speed signal processing, data conversion or complex mathematical computation. By integrating AFE circuits, ADCs, DACs and tightly coupled memory on a single chip, ASICs minimise data movement between components, cutting latency and power consumption.
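To make the determinism point concrete, dedicated datapaths typically use fixed-point arithmetic with saturation, where every operation takes a known, fixed number of cycles. The following is an illustrative model (not a description of any Swindon Silicon Systems design) of a Q15 fixed-point multiply-accumulate step, the building block of filters and dot products in such hardware:

```python
Q = 15                               # Q15 format: 1 sign bit, 15 fractional bits
MAX, MIN = 2**15 - 1, -2**15         # representable integer range

def to_q15(x: float) -> int:
    """Convert a real value in roughly [-1, 1) to Q15, with saturation."""
    return max(MIN, min(MAX, int(round(x * (1 << Q)))))

def q15_mac(acc: int, a: int, b: int) -> int:
    """One multiply-accumulate step as a dedicated datapath performs it:
    full-precision product, rescale back to Q15, then saturate.
    A fixed sequence of operations means a fixed cycle count."""
    product = (a * b) >> Q
    return max(MIN, min(MAX, acc + product))

# Dot product of two small Q15 vectors: 0.5*0.5 + 0.25*0.5 = 0.375
a = [to_q15(0.5), to_q15(0.25)]
b = [to_q15(0.5), to_q15(0.5)]
acc = 0
for x, y in zip(a, b):
    acc = q15_mac(acc, x, y)
```

Because the hardware equivalent has no caches, branch predictors or operating-system scheduling in the loop, its worst-case latency equals its typical latency, which is the property safety-critical automotive processing needs.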