You’re reading The React Native Rewind #4.
Expo SDK 52 brings a wealth of updates. Before we dive in, let’s quickly address the term everyone’s tired of hearing: the “New Architecture”. Rest assured, Expo SDK 52 isn’t forcing you into the New Architecture — not yet. If you’re upgrading from a previous release, the New Architecture is currently opt-in. However, don’t get too comfortable — SDK 53 is slated to make it the default.
Still, like any big leap, it’s worth ensuring your current setup works well before you commit. Oh, and platform updates are here too: iOS now demands a minimum version of 15.1, and Android has bumped minSdkVersion to 24 and compileSdkVersion to 35, reflecting alignment with the React Native 0.76 ecosystem.
If this is the first time you’re hearing about the New Architecture — which I find highly unlikely — go and read issue #3, where we dive deeper into the concepts.
Lights, Camera, Expo!
The Expo team has been hard at work overhauling their core libraries for handling video, audio, and images. With SDK 52, these rewrites are moving from alpha through beta to stable, bringing noticeable improvements to performance, stability, and usability, and supporting not just iOS and Android but also web and tvOS under a unified API.
Expo Video Stable Release
The expo-video library introduces a modern, optimised approach to video playback, addressing long-standing limitations of the now deprecated Expo AV Video API. Expo Video offers better performance by using platform-native video APIs more effectively, reducing lag and improving playback consistency across devices. It simplifies handling media by introducing a cleaner, more developer-friendly API, which is easier to debug and extend for custom use cases.
One standout feature is its built-in support for video thumbnail generation. Previously, developers relied on expo-video-thumbnails for this functionality, but with Expo Video, it’s integrated, cutting down on external dependencies. This rewrite also ensures full support for web and tvOS, aligning with Expo’s cross-platform vision.
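For a feel of the new API, here’s a minimal playback sketch using the useVideoPlayer hook and the VideoView component (the video URL is just a placeholder):

```tsx
import { useVideoPlayer, VideoView } from 'expo-video';

export default function Player() {
  // Create a player for a remote source and start it looping immediately.
  const player = useVideoPlayer('https://example.com/clip.mp4', (player) => {
    player.loop = true;
    player.play();
  });

  return (
    <VideoView
      player={player}
      style={{ width: '100%', aspectRatio: 16 / 9 }}
      allowsFullscreen
      allowsPictureInPicture
    />
  );
}
```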
Deprecation: While Expo AV (standing for Expo AudioVideo) combined video and audio handling in one library, this bundling often limited performance and flexibility for video-specific use cases. With Expo Video now offering a dedicated, optimised solution, the <Video/> component in Expo AV has been deprecated.
Expo Audio Beta Release
The new expo-audio library represents Expo’s latest effort to modernise its ecosystem, providing a simplified and more stable approach to audio handling. Built to replace the expo-av Audio API, this beta version prioritises ease of use for common audio tasks like playback, recording, and event handling.
The rewrite optimises how audio tasks are queued and managed natively, reducing unexpected playback interruptions and improving cross-platform consistency. By limiting niche configurations (e.g., custom buffer sizes or hardware-level processing), Expo Audio minimises potential bugs and instability. While it might not suit apps requiring advanced, low-level audio processing, its streamlined design makes adding reliable audio to most apps faster and less error-prone.
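As a rough sketch of the beta API’s shape, using the useAudioPlayer hook (the asset path is a placeholder):

```tsx
import { Button } from 'react-native';
import { useAudioPlayer } from 'expo-audio';

export default function ChimeButton() {
  // Load a bundled sound once; the player manages native playback state.
  const player = useAudioPlayer(require('./assets/chime.mp3'));

  return <Button title="Play chime" onPress={() => player.play()} />;
}
```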
Expo Image v2
The revamped expo-image library, a drop-in replacement for React Native’s <Image /> component on steroids, works seamlessly across iOS, Android, tvOS, and the Web. The new useImage hook is the star of this update, allowing developers to preload images into memory before rendering. This optimisation minimises delays and avoids repeated network or file system requests, ensuring a smoother user experience.
When an image is preloaded with useImage, it provides metadata (like dimensions) and a shared reference to a native image instance (Drawable on Android or UIImage on iOS). These references allow immediate rendering, bypassing costly I/O operations and reducing runtime bottlenecks. Additionally, the onDisplay event enables precise tracking of when images appear on the screen, offering more control over image-heavy interfaces.
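A small sketch of what preloading with useImage can look like (the URL and sizing are purely illustrative):

```tsx
import { Image, useImage } from 'expo-image';

export default function Avatar() {
  // Preload into memory; returns a shared native image reference (or null while loading).
  const image = useImage('https://example.com/avatar.png');
  if (!image) return null;

  // Width/height metadata is available before the image ever renders.
  return <Image source={image} style={{ width: image.width / 2, height: image.height / 2 }} />;
}
```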
Complementing Expo Image, the expo-image-manipulator library now includes an object-oriented API, which makes editing images (e.g., cropping, resizing, rotating, flipping) more intuitive, with support for iOS, Android, and Web. The output of these operations integrates seamlessly with the Image component from Expo Image v2.
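Here’s a rough sketch of that object-oriented flow, based on our reading of the SDK 52 docs (the sizes and output format are arbitrary):

```ts
import { ImageManipulator, SaveFormat } from 'expo-image-manipulator';

// Queue edits on a context, render to a native image ref, then save it.
async function makeThumbnail(uri: string) {
  const context = ImageManipulator.manipulate(uri);
  context.resize({ width: 256 });
  context.rotate(90);

  const imageRef = await context.renderAsync();
  const result = await imageRef.saveAsync({ compress: 0.8, format: SaveFormat.JPEG });

  return result.uri; // can be fed straight into <Image /> from Expo Image v2
}
```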
👉 Expo Image.
👉 Expo Image Manipulator
Expo Live Photo Library
With expo-live-photo, Expo adds native support for Apple’s Live Photos. Currently exclusive to iOS, this library allows developers to render Live Photos in-app, enabling more dynamic and interactive media experiences.
By leveraging native APIs for playback, Expo Live Photos ensures smooth and responsive rendering without requiring third-party solutions. While its scope is limited to iOS, it opens new possibilities for apps that prioritise media-rich content.
Expo Fetch API
Expo SDK 52 introduces a new, WinterCG-compliant Fetch API, exposed from expo/fetch and designed as a drop-in replacement for global.fetch. Fully supporting iOS, Android, tvOS, and Web, it provides a unified networking solution for multi-platform apps, with features like download streaming for efficient handling of data streams.
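As a rough sketch of what streaming a response looks like (the URL is a placeholder and error handling is omitted):

```ts
import { fetch } from 'expo/fetch';

// Read a response body chunk by chunk instead of buffering it all at once.
async function streamDownload(url: string) {
  const response = await fetch(url);
  const reader = response.body!.getReader();

  let received = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    received += value.length;
    console.log(`Received ${received} bytes so far`);
  }
}
```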
FileSystem (Next), Now in Beta
File handling in Expo has taken a leap forward with expo-file-system/next, a modernised API designed to simplify working with files and directories. The old API’s reliance on asynchronous promises and platform-specific quirks has been replaced with synchronous operations and shared objects, making file interactions more intuitive and performant.
Fully backward-compatible, expo-file-system/next can be gradually adopted alongside the existing API. This rewrite streamlines workflows, shifting from juggling URIs and manual state to a contextual, object-oriented approach.
With the new API, the process is more declarative and state-aware. The File and Directory classes provide shared objects that can be read, written, or manipulated using synchronous methods. This eliminates the need to manage asynchronous operations and makes the code easier to read and maintain.
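A small sketch of the class-based flow, following the shape of the beta docs (the file name and contents are placeholders):

```ts
import { File, Paths } from 'expo-file-system/next';

// Create a file in the cache directory, write to it, and read it back,
// all via synchronous calls on a shared File object.
const file = new File(Paths.cache, 'notes.txt');
file.create();
file.write('Hello from SDK 52');
console.log(file.text()); // "Hello from SDK 52"
```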
👉 FileSystem (Old)
👉 FileSystem (next)
Expo DOM Components: <div> in Native Land
We mentioned DOM Components in issue #2, but it’s worth revisiting. Evan Bacon, our saviour, The Full Stack Baconator, Baconbrix, Lord of the Bacon, does an excellent deep dive on the concept in this blog post.
DOM Components blur the line between web and native, allowing you to embed HTML directly in your React Native app. Unlike a standard web view, they let you pass props like normal React components while leveraging web technologies for unique use cases.
This opens up new possibilities, like incrementally migrating web apps to native, embedding unique web libraries like Protomaps, or just proving that <marquee> still has a purpose in 2024.
Using DOM Components is straightforward. Add the directive "use dom" at the top of a React DOM component file, and suddenly your <div> is available to import into a React Native app. While these components render inside a web view, data and props flow asynchronously.
Simply add the "use dom" directive at the top of your regular React (HTML-based) file:
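Something along these lines (the file and prop names are just placeholders):

```tsx
'use dom';

// hello-dom.tsx: a regular React DOM component. It will render inside a
// web view when imported from native code.
export default function HelloDom({ name }: { name: string }) {
  return <h1>Hello from the DOM, {name}!</h1>;
}
```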
Inside the native component file, import the web component to use it:
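For example, assuming the component above lives at ./hello-dom:

```tsx
import { View } from 'react-native';
import HelloDom from './hello-dom';

export default function Screen() {
  // Used like any other component; props cross the native/web boundary asynchronously.
  return (
    <View style={{ flex: 1 }}>
      <HelloDom name="React Native Rewind" />
    </View>
  );
}
```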
Expo Workflows: Built for EAS, Ready for You
Expo has unveiled Workflows, a CI/CD (Continuous Integration/Continuous Deployment) solution purpose-built for React Native and Expo apps. Unlike GitHub Actions or CircleCI, which are highly customisable but require significant configuration, Workflows integrates seamlessly into the Expo Application Services (EAS) ecosystem. By offering pre-packaged jobs (like running end-to-end tests with Maestro), automated pipelines, and smart machine allocation, Workflows eliminates much of the complexity developers face when setting up mobile-specific CI/CD processes.
For teams already using EAS for builds and deployments, Workflows offers a major advantage: it consolidates CI/CD tasks into a single, specialised platform. Previously, developers often used GitHub Actions or CircleCI to trigger EAS, maintaining two CI systems — one for generic workflows and another for Expo-specific tasks.
You can set up your workflows to trigger build, submit, update, and test in Expo Application Services, seamlessly integrated with your own custom CI logic.
For those already deep in the Expo ecosystem, Workflows offers a simpler, more focused alternative to traditional CI/CD tools. If you’ve been juggling GitHub Actions to orchestrate your EAS pipelines, this streamlines the process by consolidating everything back into EAS.
Splash Screens and What the F&%! are Tinted Icons
Expo SDK 52 brings some much-needed updates to splash screens and support for iOS 18’s controversial tinted icons. With the transition to the SplashScreen API on Android 12 and beyond, splash screens are now smoother and more customisable.
No more awkward transitions or layout jumps; developers can now add fade effects with options like SplashScreen.setOptions({ fade: true, duration: 1000 }) for a polished user experience.
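Put together, a minimal sketch looks something like this (the loading work is stubbed out for brevity):

```tsx
import { useEffect } from 'react';
import { Text, View } from 'react-native';
import * as SplashScreen from 'expo-splash-screen';

// Keep the splash screen up until we explicitly hide it.
SplashScreen.preventAutoHideAsync();

// Fade out over one second instead of cutting away abruptly.
SplashScreen.setOptions({ fade: true, duration: 1000 });

export default function App() {
  useEffect(() => {
    // Hide once fonts, data, etc. are ready; done immediately here for brevity.
    SplashScreen.hideAsync();
  }, []);

  return (
    <View style={{ flex: 1, alignItems: 'center', justifyContent: 'center' }}>
      <Text>Ready!</Text>
    </View>
  );
}
```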
Dark mode support has also officially landed in the expo-splash-screen plugin, addressing a long-standing feature request. If you’re still relying on the splash field in app.json, now is the time to migrate to the plugin-based configuration, as the old approach is being deprecated.
Android developers using full-screen splash images should prepare to switch to smaller assets, like icons, to ensure compatibility.
The splash screen is now set up via the plugins field in app.json with an additional set of options:
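A sketch along these lines, using the plugin’s documented options (asset paths and colours are placeholders):

```json
{
  "expo": {
    "plugins": [
      [
        "expo-splash-screen",
        {
          "image": "./assets/splash-icon.png",
          "imageWidth": 200,
          "resizeMode": "contain",
          "backgroundColor": "#ffffff",
          "dark": {
            "image": "./assets/splash-icon-dark.png",
            "backgroundColor": "#000000"
          }
        }
      ]
    ]
  }
}
```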
iOS 18 introduced tinted icons, a feature that applies dynamic colour overlays to app icons depending on the system’s colour scheme. While this sparked mixed reactions, Expo has added support for customising these overlays. This means you can now create icons that look great even under Apple’s… creative interpretations of your branding requirements. If you’re not a fan of surprises, this feature ensures your app icon looks intentional rather than algorithmically adjusted.
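Configuration-wise, the iOS icon in app.json can now be split into light, dark, and tinted variants, roughly like this (the asset paths are placeholders):

```json
{
  "expo": {
    "ios": {
      "icon": {
        "light": "./assets/icon-light.png",
        "dark": "./assets/icon-dark.png",
        "tinted": "./assets/icon-tinted.png"
      }
    }
  }
}
```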
👉 Tinted Icons in Expo.
👉 Apple’s Tinted Icons.
Edge-to-Edge and the Mystery of the Missing SafeAreaView
We covered Android 15’s edge-to-edge enforcement in issue #3, but here’s a quick refresher: edge-to-edge means your app’s content now extends under the status and navigation bars by default, using the full screen. While this creates a more immersive experience, it also demands precise layout adjustments to prevent UI overlap.
Apps targeting targetSdkVersion 35 must adapt to this behaviour, as previously used props for controlling system bars (like setting their colours) are being deprecated. Initially designed for iOS, <SafeAreaView> handled similar issues by adding padding, but it was never officially supported on Android and lacked flexibility. With Android’s new default, <SafeAreaView> is being deprecated.
react-native-safe-area-context is the recommended replacement, offering hooks for dynamic insets and precise layout control across both iOS and Android. For fully immersive layouts, react-native-edge-to-edge, backed by Expo, fills additional gaps to help developers transition seamlessly to edge-to-edge designs. These tools ensure your app is ready for Android 15 while maintaining a clean, polished user experience.
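A minimal sketch of the recommended setup with react-native-safe-area-context:

```tsx
import { Text, View } from 'react-native';
import { SafeAreaProvider, useSafeAreaInsets } from 'react-native-safe-area-context';

function Screen() {
  // Insets account for the status bar, notches, and the navigation bar on both platforms.
  const insets = useSafeAreaInsets();
  return (
    <View style={{ flex: 1, paddingTop: insets.top, paddingBottom: insets.bottom }}>
      <Text>Edge-to-edge, without the overlap.</Text>
    </View>
  );
}

export default function App() {
  return (
    <SafeAreaProvider>
      <Screen />
    </SafeAreaProvider>
  );
}
```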
You can find the full Expo 52 release notes here.
React Compiler: Optimising React, Automatically
React Compiler is an experimental tool that automates key optimisations for React apps. By analysing your code, it reduces unnecessary re-renders and trims bundle sizes, offering better runtime performance with minimal manual effort. For universal React apps — especially those leveraging server-side rendering — it simplifies performance by handling memoization automatically.
The React Compiler rewrites components and hooks to enable fine-grained reactivity, removing the need for manual optimisations like React.memo, useMemo, or useCallback. That said, while the compiler itself is experimental, its integration with Expo is even newer: it’s not enabled by default, and you should expect its status in Expo to evolve as further updates land.
To enable React Compiler in Expo, update your app.json:
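The flag lives under experiments (a minimal sketch):

```json
{
  "expo": {
    "experiments": {
      "reactCompiler": true
    }
  }
}
```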
AI in Your Pocket with ExecuTorch
React Native ExecuTorch brings Meta’s ExecuTorch framework to React Native, letting you run AI models directly on mobile devices. Forget sending data to the cloud — this library enables privacy-first, low-latency AI execution locally.
ExecuTorch is optimised for running machine learning models on hardware like phones or microcontrollers, making it a perfect fit for apps needing real-time AI performance without compromising user data security.
Authored by Luke Farrell and edited by Friosn.
Heavily inspired by the Bytes JavaScript newsletter — one of our favourites for dev updates — this newsletter is all about diving deeper into React Native and its ecosystem. We still recommend checking out Bytes though, because their meme game is strong: https://bytes.dev
If you’re enjoying this newsletter, why not help us grow? Share it with your friends, family, and even that one coworker who always “forgets” to update dependencies. Use this link: https://www.reactnativerewind.com
We love your feedback, and to show our thanks, we’re highlighting your projects! Keep the ideas coming, and we’ll keep sharing your work. Now, here are this week’s plugs:
- For those who’ve turned coding cash into real-world assets — like property, instead of putting it all into Dogecoin like I did — Zeeshan Khan has shared a React Native app called Acreetr, which we think stands for A Cool Real Estate Experience, Totally Revolutionised. It helps you manage properties, collect rent, and chat with tenants.
- A special shout-out to the brave souls still speaking the languages of Dependency Injection and OOP in the world of React Native. If you’re curious about how you can further decouple parts of your application by separating dependencies out behind adjustable interfaces, check out Obsidian from Guy Carmeli at Wix for a fascinating take on these concepts: https://github.com/wix-incubator/obsidian.