Apple is considering a radical redesign for the 20th anniversary iPhone that could feature a completely bezel-less display curving around all four edges of the device, according to a new report out of Korea.

ETNews writes that Apple is aiming to use "four-edge bending" display technology for the 2027 iPhone that would curve not just along the left and right sides as seen in some current smartphones, but also wrap around the top and bottom edges. This would create a truly borderless visual experience with content flowing seamlessly across all sides of the device.

This follows a report by Bloomberg over the weekend that said Apple plans to launch a "mostly glass, curved iPhone without any cutouts in the display." The Information last week also cited multiple sources claiming that at least one new iPhone model launching in 2027 will have a truly edge-to-edge display. Bloomberg previously reported that the device would be a Pro model in Apple's 2027 lineup.

Based on the latest report, the ambitious design would eliminate traditional screen borders entirely, potentially marking one of the most significant design shifts in the iPhone's history since the 10th anniversary iPhone X, which saw Apple drop the Home button, introduce a notched display, and adopt an intuitive swipe gesture-based navigation interface.

The new display technology is said to be part of Apple's broader push toward next-generation hardware for the commemorative iPhone. The company is also reportedly looking to develop an OLED display driver IC (DDI) based on 16nm FinFET process technology, instead of the traditional 28nm planar process currently used.

The display driver technology would apparently deliver significant power efficiency improvements, which is likely to be increasingly important as smartphones get thinner and enhanced on-device AI features demand more power. The report also claims that Apple's 20th anniversary iPhone could include pure silicon batteries instead of graphite-based ones for increased energy density.

Apple is expected to begin discussions with OLED suppliers Samsung Display and LG Display about implementing the advanced technologies for the anniversary model, according to the report.

Technical Challenges

Achieving a four-sided curved all-display design would present Apple with some major technical challenges, such as placing the front-facing sensors, cameras, and speakers under the display.

As a stepping stone towards this all-screen design in 2027, The Information on Tuesday reported that the iPhone 18 Pro and iPhone 18 Pro Max will be equipped with under-screen Face ID, with only a small pinhole remaining for the front camera on those devices. Counterpoint Research vice president Ross Young has since corroborated the report.

As for the front-facing camera, several Android phones already feature under-display selfie cameras, and Apple has reportedly been working on its own solution for some time. According to an April 2024 report, LG Innotek – one of Apple's Korean suppliers – is developing under-display cameras that leave no visible hole when inactive. These systems use a "freeform optic" multiple lens array designed to reduce image distortion and improve brightness, compensating for the light loss that typically occurs when a camera sits behind a display.

However, another concern with adopting a four-sided curved all-screen design would be structural integrity, since bending the screen on all sides would introduce new stress points that could be more vulnerable to damage from drops. The described design could also require big changes to user interaction. With no bezels to rest fingers on, Apple would likely need to develop new palm-rejection algorithms and potentially revise iOS gesture navigation for edge-sensitive input.

Assuming the reports are accurate, Apple has some major hardware and software hurdles to overcome over the next two years. If the company can pull it off though, such a design sounds like a fitting way to celebrate two decades of Apple's most iconic product.


While the iPhone 17 Pro models are still around four months away, there are already rumors about next year's iPhone 18 Pro models.

The latest word comes from Counterpoint Research vice president Ross Young, who has a good track record with display-related information for future Apple products. In a social media post today, he relayed more information that suggests the iPhone 18 Pro models will be equipped with an under-screen Face ID system.

"At the SID Business Conference today, OTI Lumionics CEO Michael Helander confirmed that they expect phones with under panel Face ID using their materials to be available for sale in 2026," wrote Young. "This suggests that iPhone 18 Pro models will have under panel Face ID with other brands and models to follow."

Earlier this month, The Information's Wayne Ma also reported that iPhone 18 Pro models will feature under-screen Face ID. He said the devices will have only a small hole in the top-left corner of the screen to accommodate the front camera. The devices will no longer have a pill-shaped cutout at the top of the screen, according to the report, but it is unclear whether that means the Dynamic Island feature will be discontinued.

Apple should release the iPhone 18 Pro models in September 2026.

Google today previewed the next-generation version of Android, which has a more expressive design language and a range of new features. One of the main new additions is Live Updates, a feature that mirrors Live Activities on the iPhone.

Live Updates lets Android users track progress notifications from apps in real time, much like Live Activities. Google says that Live Updates will work with top delivery, ride share, and navigation apps.

With Live Activities, incoming food deliveries, sports games, and more can be tracked from the iPhone's Lock Screen or with the Dynamic Island, and Google's feature works in a similar way. Android users will see Live Updates on the lock screen and home screen, and can tap into a menu bar option to get more information.

Google is revamping Android with an interface focused on color, movement, and haptics, and it'll be interesting to see how it compares to the rumored iOS 19 redesign that Apple has in the works.


Google took some cues from Apple on security, adding a Find Hub that's similar to the Find My app for tracking people, items, and devices, with Google even teaming up with airlines for luggage recovery. There's also an Advanced Protection option that looks similar to Lockdown Mode, offering additional safeguards against malware and suspicious contact for journalists and politicians.

More new features that are coming to Android 16 will be introduced at Google I/O next week.

Over a billion RCS messages are sent on a daily basis in the United States, Google said today. ‌RCS‌, or Rich Communication Services, is a communication protocol that replaced the prior SMS and MMS messaging standards.

It was developed by the GSM Association, but Google has been championing it for years. Apple held out on adopting ‌RCS‌ for quite some time, but finally added support with the launch of iOS 18. On devices running ‌iOS 18‌, ‌RCS‌ is the default messaging protocol for texts between an iPhone user and an Android user.

Apple's adoption of ‌RCS‌ has undoubtedly increased the number of ‌RCS‌ messages sent per day, but Google's 1 billion figure includes Android to Android text messages along with Android to ‌iPhone‌ text messages (and vice versa).

‌RCS‌ is a notable improvement over SMS and MMS, and it makes for a better texting experience between ‌iPhone‌ users and Android users, as Android users can't take advantage of iMessage. Some of the ‌RCS‌ features:

  • Support for higher resolution photos and videos.
  • Support for larger file sizes and file sharing.
  • Audio messages.
  • Cross-platform emoji reactions.
  • Real-time typing indicators and read receipts.
  • Better group chats with support for removing people.
  • The option to send messages over cellular or Wi-Fi.

For ‌iPhone‌ to ‌iPhone‌ conversations, iMessage is still the default, but ‌RCS‌ has made "green bubble" texting with Android users less of a hassle.

‌RCS‌ is supported by the major carriers in the United States, but some smaller carriers like Boost Mobile, Mint Mobile, Ting, and others have yet to add support.

Ecovacs today announced the launch of its new flagship robot vacuum, the X9 Pro Omni. The robot is able to vacuum and mop floors, and it includes HomeKit integration so it can be controlled using Siri and the Home app.

The X9 Pro Omni is a sensor-laden cleaning bot that can vacuum and mop floors throughout the home, while also using AI to navigate obstacles like pets, items on the floor, furniture, and more. The bot is able to map an entire home with its 3D sensors, detecting walls and using a moving mop and side brush to get debris out of corners and along walls.

Like the prior-generation X8 Pro Omni, the new X9 model features Ecovacs' standout mopping feature, the Ozmo Roller Mop, which washes itself as it cleans so that dirty water isn't spread over the floor. The X9 Pro Omni has separate clean and dirty water tanks in the robot itself, which is a unique feature, plus the mopping roller applies more downward pressure than traditional mop heads for a better clean. The mop lifts when the robot encounters carpet, so water does not get on rugs or carpeted areas, and hot air drying kicks in when cleaning is finished.

As for vacuuming, the X9 Pro Omni uses what Ecovacs calls "Blast" technology, with a 100W high-torque motor and optimized airflow path for better, quieter suction. There is a ZeroTangle brush for pet and human hair that does not get clogged up, cutting down on maintenance.

The X9 Pro Omni refills itself at a base station equipped with clean water, a dust bag, cleaning fluid, and a dirty water reservoir, and all of its functions are controlled via the Ecovacs iPhone app. There are a range of cleaning modes, from just vacuuming to a deep clean, and there are options for using the robot as a home camera that can move from room to room.

The X9 Pro Omni is one of a handful of robot vacuums that offer Matter integration, and with Matter, it is able to connect to HomeKit. Since iOS 18.4, Apple's Home app has supported robot vacuums, which means you can ask Siri to vacuum or mop your house. Siri can be used for an overall cleaning, or you can ask it to clean specific rooms, and the robot can be controlled from either the Home app or the Ecovacs app.

The X9 Pro Omni can be purchased from the Ecovacs website for $1,300, which includes a limited-time $300 discount to celebrate the launch. We'll have a review of the new robot coming in the next few weeks.

Earlier this month, PayPal said that it would debut contactless iPhone payments in Germany, and German iPhone users now appear to be able to use the feature. According to German site iPhone Ticker, some PayPal customers in Germany have access to PayPal as an alternative to Apple Pay.

PayPal can be used for NFC tap to pay functionality just like ‌Apple Pay‌, with payments initiated in the same way. PayPal users in Germany are able to set PayPal as the default payment app over ‌Apple Pay‌, using the side button to bring up PayPal as a payment option for one-tap contactless payments in retail locations.

If PayPal is not set as the default payment method, contactless payments can be made by opening up the PayPal app.

PayPal is able to offer direct tap to pay options in Germany because the Digital Markets Act in Europe forced Apple to allow third-party apps to use the ‌iPhone‌'s NFC chip. Apple has historically restricted access to NFC for payment purposes, only allowing contactless payments with ‌Apple Pay‌.

With the new regulations, third-party apps from payment services and banks can access the full functionality of the NFC chip, giving ‌iPhone‌ users an alternative to ‌Apple Pay‌ and the Wallet app.

Germany is PayPal's first test market, but the rollout is likely to expand to other European countries in the coming months. To use PayPal for contactless payments, German ‌iPhone‌ users will need the latest version of the PayPal app and a compatible debit or credit card. PayPal can be used in retail stores where Mastercard payments are accepted.

PayPal's tap to pay options are limited to the ‌iPhone‌, and the feature does not work on the Apple Watch because Apple has not been required to open up NFC on the Apple Watch. PayPal won't be able to bring the contactless payment option to the United States or other countries unless those locations adopt similar rules requiring Apple to expand NFC access.

To attract customers to use PayPal instead of ‌Apple Pay‌, PayPal is offering cashback promotions in the PayPal app, and will eventually add a pay over time feature with options for six, 12, and 24 monthly installments for purchases.

NFC access is available for banking and wallet apps in the European Economic Area, which includes the 27 European Union countries plus Iceland, Liechtenstein, and Norway.


The tvOS 18.5 update that Apple released yesterday adds support for synchronizing Dolby Atmos playback to speakers over AirPlay or Bluetooth, according to Apple's release notes for the update.

The feature could help address some persistent syncing issues that some Apple TV users have encountered when trying to play audio with Dolby Atmos. There are multiple complaints on Reddit and the Apple Support forums about Dolby Atmos audio syncing issues with sound bars and speakers, including those connected via AirPlay and Bluetooth, such as the HomePod.

On affected devices, users find that dialog and other audio can be slightly out of sync with the content that's playing, so lips do not move correctly when people speak in TV shows and movies, and sound effects like explosions are delayed. Switching to 5.1 surround sound eliminates the problem, but people with expensive audio setups have been understandably disappointed not to be able to use Dolby Atmos.

Apple says that the new Dolby Atmos synchronization feature for ‌AirPlay‌ and Bluetooth speakers can be found by going to Settings > Video and Audio > Wireless Audio Sync.


Remotely controlling the shutter on your iPhone's camera lets you include yourself in the photo while avoiding the limitations of a selfie. For example, it allows you to take a picture of a wider scene with you included in the frame, which is ideal for landscape shots or group photos. If your iPhone is on a tripod, taking the shot remotely also reduces the risk of camera shake. Here's how to do it – and you don't need an Apple Watch.

There is more than one way to take a picture on your iPhone remotely. If you have an Apple Watch, you can open the Camera Remote app that comes included in watchOS (see the second set of steps below). If you don't have an Apple Watch, here's how to use Voice Control.

Use Voice Control to Take a Photo

If you don't have an Apple Watch, don't worry. You can also use Voice Control to remotely control the camera on your iPhone. That's because you can trigger the camera shutter with the volume buttons, which can also be controlled with your voice. Here's how it works.

  1. Launch the Settings app and tap Accessibility.
  2. Tap Voice Control.
  3. Turn on the switch next to Voice Control so it's in the green ON position. (You should see a Voice Control active symbol and a little orange dot icon at the top of the screen indicating that Voice Control is using the microphone.)
  4. Next, launch the Camera app and line up your shot.
  5. When you're ready, say "Turn up the volume" to activate the camera's shutter and take the picture.
  6. When you're finished, you can disable Voice Control by toggling the switch again in Settings.


Control Your iPhone's Camera With Apple Watch

  1. Launch the Camera Remote app on your wrist.
  2. Position your iPhone to frame the shot you want to take.
  3. Tap the Shutter button on your Apple Watch screen.

By default, the shot is taken after three seconds to give you time to move into position, but you can disable the timer and control other settings, including flash and Live Photo, by tapping the ellipsis (three dots) button. The menu that this calls up also lets you switch between the front and rear iPhone cameras.

Final tip: If you're familiar with the Shortcuts app, you can find a shortcut in the Gallery called "Say Cheese" that lets you use Siri to control your iPhone's camera remotely. Once you've added it to your active shortcuts and given it permission to access your camera and microphone, you'll be able to take photos remotely by saying "Hey Siri, say cheese."

Apple and Universal Music Group today jointly introduced a new "Sound Therapy" collection of wellness playlists, consisting of popular songs with added sound waves or white noise to help listeners focus, relax, and sleep better.

Sound Therapy features three categories: Focus, Relax, and Sleep. The playlists include extended, instrumental, and reimagined versions of popular tracks from artists such as Imagine Dragons, Katy Perry, Kacey Musgraves, and others.

The playlists were crafted by a team of producers, scientists, and audio engineers at Sollos, a music-wellness venture within Universal Music Group.

Apple's announcement explains further:

Songs have been enhanced with auditory beats or colored noise to help encourage specific brain responses. Gamma waves and white noise — a whoosh-like combination of every sound frequency — may help with focusing; theta waves could aid in relaxation; and delta waves and pink noise — a deeper, gentler variation akin to rain or wind — might assist in achieving better sleep. A dreamy version of Katy Perry's "Double Rainbow," for example, could help listeners drift off to sleep, while an Imagine Dragons track might help them tackle a to-do list.

The playlists are powered by Universal's proprietary audio technologies, and they are backed by scientific research, according to Apple.


"Sound Therapy harnesses the power of sound waves, psychoacoustics, and cognitive science to help listeners relax or focus the mind," says Apple's announcement.

Sound Therapy is available exclusively on Apple Music.

iOS 19 will not be unveiled until June, but Apple today previewed a long list of new accessibility features that will be coming with the software update, including two CarPlay enhancements that can benefit both drivers and passengers.

We already highlighted some of the key new features, ranging from a new Accessibility Reader to a Magnifier app on the Mac. Below, we have pasted Apple's entire list of additional features, including some of the smaller ones.

For CarPlay, this includes support for the Large Text option that has long existed on iPhones. Apple is also expanding the Sound Recognition feature for drivers or passengers who are deaf or hard of hearing. CarPlay will be able to provide a notification if it hears a crying baby inside the vehicle, and it will also be able to alert users to sounds outside the vehicle, such as horns and sirens from police cars, ambulances, and fire trucks.

Here is the complete list of additional features, as worded by Apple:

  • Background Sounds becomes easier to personalize with new EQ settings, a timer to stop sounds automatically, and new automation actions in Shortcuts. Helps minimize distractions and may aid tinnitus.
  • Personal Voice now creates a voice in under a minute using 10 phrases, with more natural results thanks to on-device AI. Adds support for Spanish (Mexico).
  • Eye Tracking users on iPhone and iPad can use a switch or dwell to make selections. Keyboard typing is improved across iPhone, iPad, and Vision Pro with a dwell timer, fewer steps, and QuickPath.
  • Head Tracking allows users to control iPhone and iPad with head movements.
  • Brain Computer Interfaces (BCIs) now supported via a new Switch Control protocol for users with severe mobility disabilities.
  • Assistive Access adds a simplified Apple TV app and an API for developers to create tailored apps for users with intellectual and developmental disabilities.
  • Music Haptics becomes more customizable, allowing users to feel haptics for vocals only or an entire song, with adjustable intensity.
  • Sound Recognition adds Name Recognition to alert users when their name is called.
  • Voice Control adds a programming mode in Xcode, vocabulary syncing, and expands language support to Korean, Arabic (Saudi Arabia), Turkish, Italian, Spanish (Latin America), Mandarin Chinese (Taiwan), English (Singapore), and Russian.
  • Live Captions adds support for English (India, Australia, UK, Singapore), Mandarin Chinese (Mainland China), Cantonese (Mainland China, Hong Kong), Spanish (Latin America, Spain), French (France, Canada), Japanese, German (Germany), and Korean.
  • CarPlay now supports Large Text and enhanced Sound Recognition to detect a crying baby, horns, and sirens for deaf or hard-of-hearing users.
  • Share Accessibility Settings lets users quickly share their accessibility preferences with another iPhone or iPad — ideal for temporary device use in public or shared spaces.

iOS 19 will be unveiled during the WWDC 2025 keynote on Monday, June 9. Following months of beta testing, the software update should be released in September, bringing these features to the masses. Many of the accessibility features will also be available on iPadOS 19, macOS 16, watchOS 12, and visionOS 3.


Amazon today has low prices across nearly the entire M4 MacBook Air lineup, with up to $167 off both 13-inch and 15-inch models. Many of the notebooks in this sale are seeing delayed delivery estimates, with most arriving in late May or early June, but if you're interested you can lock in these deals now ahead of those delivery dates.

Note: MacRumors is an affiliate partner with some of these vendors. When you click a link and make a purchase, we may receive a small payment, which helps us keep the site running.

Starting with the 13-inch models, Amazon has up to $155 off all three of the new configurations of this notebook. Prices start at $849.00 for the 256GB model, then rise to $1,049.00 for the 16GB/512GB model and $1,245.26 for the 24GB/512GB model. All of these are solid second-best prices on the M4 MacBook Air, and only a few dollars higher than their all-time low prices.



Moving to the larger display models, Amazon has both 512GB versions of the 15-inch M4 MacBook Air on sale this week, as well as the 256GB model. The 16GB/512GB model is available for $1,249.00 and the 24GB/512GB model is on sale for $1,432.00. Across the board, these are all record low prices on the 15-inch M4 MacBook Air.



If you're on the hunt for more discounts, be sure to visit our Apple Deals roundup where we recap the best Apple-related bargains of the past week.



iOS 18 introduced an accessibility feature called Music Haptics that has value for everyone. When the feature is turned on, the iPhone's Taptic Engine taps and vibrates to match the audio of a song playing in Apple Music, Shazam, and supported third-party apps, so long as the device is connected to a Wi-Fi or cellular network.

With iOS 19, Music Haptics will get better in two ways.

Apple today announced that Music Haptics will be even more customizable starting later this year. First, users will have the option to receive haptic feedback for vocals only. Second, users will be able to adjust the overall intensity of taps, textures, and vibrations. These enhancements are expected to roll out with iOS 19, which will be unveiled during the WWDC 2025 keynote on June 9 and released to the general public in September.

Music Haptics is supported on the iPhone 12 and newer, excluding the latest iPhone SE.


A few years ago, Apple introduced a Personal Voice feature that allows those at risk of losing their ability to speak to create a synthesized voice that sounds similar to their actual voice, so they can continue to communicate with others. The feature debuted on the iPhone with iOS 17, and it will be getting even better on iOS 19.

Apple today announced that Personal Voice will be faster and easier to use on iOS 19, thanks to advancements in on-device machine learning and artificial intelligence. Apple says users will be able to create a smoother, more natural-sounding voice in less than a minute, down from 15 minutes when the feature initially launched.

Personal Voice will also add support for Spanish (Mexico), according to Apple.

Personal Voice integrates with another accessibility feature called Live Speech, which lets users type what they want to say to have it be spoken aloud during in-person conversations, phone calls, and FaceTime video calls.

Personal Voice is also available on the iPad and Mac, and the enhancements to the feature will extend to iPadOS 19 and macOS 16.


Apple is planning to allow users to natively control iPhones, iPads, and other devices using brain signals later this year, The Wall Street Journal reports.

The initiative involves a partnership with Synchron, a neurotechnology startup that produces an implantable brain-computer interface (BCI) device called the Stentrode. The Stentrode enables users with severe motor impairments, such as those caused by amyotrophic lateral sclerosis (ALS), to control Apple devices using neural signals detected from within blood vessels located above the brain's motor cortex.

The Stentrode is implanted through the jugular vein and rests inside a blood vessel on the surface of the brain. The device contains 16 electrodes that can detect motor-related brain activity without requiring open-brain surgery. These neural signals are then translated into digital commands that allow users to interact with an interface.

Synchron has implanted the Stentrode in ten patients since 2019 under the FDA's investigational device exemption. One test participant based in Pennsylvania with ALS, who cannot use his arms or hands, is able to use the Apple Vision Pro and other Apple devices through thought alone, although it is slower than conventional input mechanisms.

In 2014, Apple introduced the "Made for iPhone" hearing aid protocol as a Bluetooth standard that enables seamless wireless communication between hearing aids and Apple devices. The company is now apparently pursuing a similar approach with brain-computer interfaces, aiming to establish a dedicated industry standard in collaboration with Synchron.

Apple is apparently planning to add support for BCIs into its existing Switch Control accessibility framework, which allows input from non-standard devices such as joysticks and adaptive hardware. The company reportedly intends to release this new standard later in 2025.

Synchron's approach differs significantly from that of other companies such as Neuralink, which is developing a more invasive implant called the N1. Neuralink's device contains more than 1,000 electrodes embedded directly into brain tissue, providing a higher-resolution neural data stream. This allows for more complex control, including moving a cursor across a screen and typing using mental intention.

See Synchron's full press release for more information.

Apple today previewed a wide range of new accessibility features coming later this year on the iPhone, iPad, Mac, Apple Watch, and Apple Vision Pro. The announcement comes two days ahead of Global Accessibility Awareness Day.

These features are expected to debut across iOS 19, iPadOS 19, macOS 16, watchOS 12, and visionOS 3, which will be unveiled during the WWDC 2025 keynote on Monday, June 9. Following months of beta testing, the software updates should be released in September, bringing the new accessibility features to the masses.

Accessibility Reader

Some of the key new features:

  • Accessibility Nutrition Labels in the App Store, which will highlight accessibility features within apps and games.
  • Apple is expanding its Magnifier app to the Mac, allowing users to connect an iPhone or USB camera to zoom in and read text on objects around them.
  • Vehicle Motion Cues are expanding to the Mac, to help reduce motion sickness for users in a moving vehicle.
  • A new system-wide Accessibility Reader tool on the iPhone, iPad, Mac, and Apple Vision Pro will make text easier to read for users with low vision or dyslexia. The feature will give users new ways to customize text and focus on content they want to read, with extensive options for font, color, and spacing.
  • Live Captions are coming to the Apple Watch, allowing users to read a transcription of what their iPhone hears on their wrist.
  • An enhanced Zoom feature on the Apple Vision Pro will allow users to magnify everything in view, including their surroundings, using the device's main camera.
  • Braille Access will turn the iPhone, iPad, Mac, and Apple Vision Pro into a full-featured braille note taker.
  • Personal Voice will become faster and easier to use, thanks to advancements in on-device machine learning and artificial intelligence. Apple says the feature will be able to create a smoother, more natural-sounding replication of your voice in less than a minute, using only 10 recorded phrases.
  • CarPlay will support the Large Text option, and Sound Recognition will be able to identify a crying baby and sirens passing by.

Magnifier on the Mac

More details about these features and many others can be found in Apple's press release.


Foreign-branded smartphone shipments in China, dominated by Apple's iPhone, plunged 49.6% year-over-year in March 2025, according to data released by the China Academy of Information and Communications Technology (CAICT).

The steep decline saw shipments fall to just 1.89 million units, down from 3.75 million during the same period last year. That shrinks Apple's share of the Chinese market to approximately 8%, while domestic brands now control 92% of smartphone shipments.

For the entire first quarter, non-Chinese brand shipments declined over 25%, while total smartphone shipments in China actually increased by 3.3%.

Apple's struggles come as domestic competitors have gained ground. Counterpoint Research reports Huawei now leads with a 19.4% share, followed by Vivo (17%), Xiaomi (16.6%), and Oppo (14.6%). Apple has slipped to fifth place with 14.1%.

Several factors are driving Apple's declining fortunes. The company faces competition from rejuvenated local brands like Huawei, which has rebounded with proprietary chips and its HarmonyOS Next software.

Chinese government policies appear to be playing a role too. Under a government subsidy program, consumers get a 15% refund on electronics priced under 6,000 yuan ($820). Apple's standard iPhone 16 starts at 5,999 yuan.

In response to the declines, Apple is reportedly cutting prices on some iPhone 16 Pro models ahead of China's "618" shopping festival.

Apple CEO Tim Cook acknowledged the challenges during his recent earnings call, noting that revenue from Greater China dropped 2% in the quarter ending March 2025. That was actually an improvement compared to the 11% decline during the 2024 holiday season.

Analysts also believe Apple's slower adoption of generative AI features is a disadvantage in the innovative Chinese market.

(Via DigiTimes.)


Samsung today introduced the Galaxy S25 Edge, an ultra-thin smartphone that will compete with Apple's upcoming iPhone 17 "Air." The Galaxy S25 Edge features a 6.7-inch AMOLED display and measures 5.8mm thick.


Comparatively, rumors suggest that the iPhone 17 Air will have a 6.6-inch display and a thickness of 5.5mm, so it may be slightly smaller and thinner than the S25 Edge. The Galaxy S25 Edge weighs 163 grams, so the thin design and the light weight are noticeable when it is compared to a standard Galaxy S25 Ultra or a current iPhone 16 Pro Max. The Galaxy S25 Ultra weighs 218 grams, while the ‌iPhone 16 Pro‌ Max weighs 227 grams.

Samsung is using a titanium frame for the Galaxy S25 Edge, which is also what we're expecting for the ‌iPhone 17 Air‌. It also includes Corning Gorilla Glass Ceramic 2, which Samsung says improves resilience.

While the iPhone 17 Air is expected to have only a single 48-megapixel Wide lens, Samsung equipped the Galaxy S25 Edge with a dual-lens camera setup. There's a 200-megapixel wide-angle lens and a 12-megapixel ultra-wide lens.

Aside from the thin and light design, the Galaxy S25 Edge is basically identical to the other smartphones in the Galaxy S25 lineup, offering the same performance and feature set. It includes the Snapdragon 8 Elite Mobile Platform from Qualcomm, and a revamped vapor chamber for heat dissipation.

It has AI tools like Drawing Assist and Audio Eraser, along with ProScaler for improved image scaling functionality. There's a Now Brief and Now Bar with AI-updated information that changes throughout the day and incorporates info from third-party apps, plus it includes Google Gemini features thanks to Samsung's partnership with Google.

Samsung's Galaxy S25 Edge can be pre-ordered from the Samsung website, and pricing starts at $1,099.99 for the 256GB model. Samsung is offering a $50 credit for those who pre-order, and upgrading anyone who purchases the 256GB model to 512GB. The deal is available through May 30.

Samsung is also offering up to $630 in trade-in credits toward the purchase of a Galaxy S25 Edge.

Apple is planning to implement a change to pasteboard (aka your iPhone's internal clipboard) that will prevent Mac apps from being able to read the pasteboard without the user being alerted, according to information Apple has shared with developers.

In macOS 16, Mac users will get an alert when a Mac app reads the pasteboard without direct user interaction. This change means apps won't be able to surreptitiously view the things you've copied and pasted.

Mac users won't see an alert for direct pasteboard-related actions, like copying and pasting text within an app that supports it. Users will only be notified if an app tries to view pasteboard data when the paste feature hasn't been used.

Apple says that the Mac pasteboard will work similarly to the iOS pasteboard going forward. On the ‌iPhone‌ and iPad, Apple blocks apps from snooping on pasteboard data, and has done so since iOS 14 after security researchers found that dozens of popular iOS apps were reading the contents of the pasteboard without user consent.

Apple addressed the problem by adding a banner that notifies you when an iOS app accesses the clipboard. In iOS 15, Apple further enhanced the feature by introducing a secure paste option that prevents developers from seeing the clipboard entirely unless you copy something from one app and paste it into the app you're actively using.

With the upcoming Mac changes, Mac developers will be able to "examine the kinds of data" on the pasteboard without actually reading them, improving pasteboard privacy. Pasteboard data used with the privacy-focused API won't show the alert to end users. From Apple's notice to developers:

Prepare your app for an upcoming feature in macOS that alerts a person using a device when your app programmatically reads the general pasteboard. The system shows the alert only if the pasteboard access wasn't a result of someone's input on a UI element that the system considers paste-related. This behavior is similar to how UIPasteboard behaves in iOS.

New detect methods in NSPasteboard and NSPasteboardItem make it possible for an app to examine the kinds of data on the pasteboard without actually reading them and showing the alert. NSPasteboard also adds an accessBehavior property to determine if programmatic pasteboard access is always allowed, never allowed, or if it prompts an alert requesting permission. You can adopt these APIs ahead of the change, and set a user default to test the new behavior on your Mac.
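For context, here is a minimal Swift sketch of how an app might adopt the behavior Apple describes: check the pasteboard's access behavior, inspect what kinds of data are present without reading them, and only read the actual contents in response to a user action. The accessBehavior property and the "detect methods" are named in Apple's notice, but the specific method, case, and pattern names used below (detectPatterns(for:), .alwaysAllow, .probableWebURL) are assumptions modeled on the existing iOS UIPasteboard API rather than confirmed macOS API.

```swift
import AppKit

// Hedged sketch of adopting the upcoming macOS pasteboard privacy changes.
// `accessBehavior` and the detection methods are described in Apple's
// developer notice; the exact names below are assumptions modeled on the
// iOS UIPasteboard API and may differ from the final macOS API.
func handlePasteboardPrivately() {
    let pasteboard = NSPasteboard.general

    // Check how programmatic reads will be treated: always allowed,
    // never allowed, or gated behind a permission alert.
    switch pasteboard.accessBehavior {
    case .alwaysAllow:
        print("Programmatic reads are allowed without prompting.")
    case .alwaysDeny:
        print("Programmatic reads are blocked for this app.")
    default:
        print("Reads outside of a paste action may trigger an alert.")
    }

    // Examine the kinds of data on the pasteboard without reading the
    // contents, which does not show the alert to the user.
    pasteboard.detectPatterns(for: [.probableWebURL]) { result in
        if case .success(let patterns) = result, patterns.contains(.probableWebURL) {
            // Only read the actual contents in response to an explicit
            // user action (e.g. a paste command), which is exempt from
            // the alert.
            print("Pasteboard likely contains a web URL.")
        }
    }
}
```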

Apple software engineer Jeff Nadeau mentioned on Mastodon that Apple has come across Mac apps that continuously scrape the pasteboard in the background. At the same time, there are apps that legitimately need programmatic pasteboard access, which is why Apple has designed the new APIs.

Mac apps will also need to get user permission to access the pasteboard in some situations. Apple says that developers are able to test the upcoming pasteboard changes with their apps ahead of when the functionality rolls out to users.