
Apple brings new accessibility features to iOS 14, including AirPods Pro audio adjustments




Apple has designed new accessibility features for its popular devices.

Angela Lang / CNET

With iOS 14, Apple is bringing many new accessibility features to its devices, including several that may help people without disabilities too. The list ranges from the ability to customize Transparency mode on AirPods Pro, to capturing multiple frames with the iPhone's Magnifier, to a new Back Tap feature that lets you tap the back of the iPhone to take screenshots and perform other actions.

Many of the new enhancements may appeal to people who are deaf or hard of hearing, while other features will benefit users who are blind or have low vision, expanding Apple's years-long effort to make its devices and software easier to use.

These improvements aren't limited to the iPhone and iPad. Apple Watch users can now configure accessibility features while setting up the watch, and they can turn on an extra-large watch face with bigger, bolder complications, the at-a-glance bits of information about the weather and other items, to help people with low vision see them better.

On Monday, Apple unveiled iOS 14, iPadOS 14 and other software updates at its annual Worldwide Developers Conference. The company used WWDC to show off the biggest updates to its operating systems, which will become available to all Apple device users later this year. For now, developers and other beta testers can access early versions of the software to get their own apps ready and to help Apple catch bugs before the improved features roll out widely. That includes the accessibility features.

The Sound Recognition feature in iOS 14 can notify you of alarms, animal noises, doorbells and other sounds. It can be found in the Accessibility menu in Settings.

Screenshot by Jason Hiner/CNET

The US Centers for Disease Control and Prevention estimates that one in four Americans lives with a disability. In the past, people with special needs had to spend thousands of dollars on technology to enlarge computer screens, speak navigation directions, identify money or name the colors of their clothes. Today, a smartphone, a computer and a handful of apps and accessories are all users need to navigate both the physical and online worlds.

Apple has built accessibility features into its products for years. Its technology can help people with low vision navigate the iPhone's touch screen, or let people with motor impairments virtually tap interface icons. It has a Made for iPhone program that certifies hearing aids to work with its devices. And two years ago, Apple gave users the ability to turn their iPhone and AirPods into a remote microphone through its Live Listen feature.

iOS 14, iPadOS 14, watchOS 7 and other upcoming software expand on those offerings.

Hearing features

  • Headphone Accommodations lets users adjust the frequencies of audio played through their AirPods Pro, second-generation AirPods, select Beats headphones and EarPods. Everyone can customize their own settings to soften or amplify particular sounds. Users can set up as many as nine unique profiles (such as one for movies and a different one for phone calls), using three amplification tunings and three levels of strength.
  • AirPods Pro Transparency mode gets its own benefit from Headphone Accommodations: the ability to customize how much of the surrounding environment you hear. Quiet voices can become crisper, and ambient sounds can come through in more detail.
  • Sound Recognition makes it easier for people who are deaf or hard of hearing to know about sound-based alarms, alerts and notifications. When an iPhone, iPad or iPod Touch hears a particular type of sound or alert, it sends a notification to the user's device (including Apple Watch). Sounds the system can detect include alarms such as sirens, home smoke alarms or building fire alarms, and household noises such as doorbells, car horns, appliance beeps and running water. Apple is also working on detecting the sounds of people and animals. (A code sketch of this kind of on-device sound classification appears after this list.)
  • People who sign rather than speak can now converse more easily over Group FaceTime. Ordinarily in a group call, the person speaking is made more prominent, with that person's video tile growing larger. In iOS 14, FaceTime can detect when someone is using sign language and make that person's video window prominent in the same way.
  • The Noise app, introduced in watchOS 6 last year, measures ambient sound levels to give users a sense of how loud their surroundings are. With watchOS 7, customers will also be able to see how loudly they're listening to audio through headphones on their iPhone, iPod or Apple Watch. A hearing control panel shows in real time whether audio is playing within the limit recommended by the World Health Organization: a person can listen at 80 decibels for about 40 hours per week without harming their hearing. When the safe weekly listening amount is reached, the Apple Watch sends the wearer a notification. (A sketch of reading headphone exposure data appears after this list.)
  • Real-Time Text lets people with hearing or speech difficulties communicate by two-way text in real time during phone calls. The iPhone has supported RTT since 2017, but Apple is now making it easier to multitask during calls with incoming RTT messages. Users will be notified even when they're not in the Phone app with the RTT conversation view open.
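
Sound Recognition itself is a system feature with no public developer API, but similar sound classification can be built with Apple's SoundAnalysis framework. The sketch below is minimal and partly hypothetical: it assumes a Create ML-trained sound classifier whose Xcode-generated class is named SoundClassifier (not a real Apple model) and that microphone permission has been granted. It streams mic audio into an analyzer and prints each confidently detected label.

```swift
import AVFoundation
import CoreML
import SoundAnalysis

// Sketch: classify live microphone audio with a custom Core ML model.
// "SoundClassifier" is a hypothetical class Xcode would generate from a
// Create ML sound-classification model; microphone permission is required.
final class SoundDetector: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let format = engine.inputNode.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        let model = try SoundClassifier(configuration: MLModelConfiguration()).model
        let request = try SNClassifySoundRequest(mlModel: model)
        try analyzer.add(request, withObserver: self)
        self.analyzer = analyzer

        // Feed microphone buffers to the analyzer as they arrive.
        engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
            analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
        }
        try engine.start()
    }

    // SoundAnalysis calls this whenever the model produces a classification.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let best = result.classifications.first, best.confidence > 0.8 else { return }
        print("Heard: \(best.identifier)")  // e.g. "doorbell", "smoke_alarm"
    }
}
```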
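
On the numbers side, the WHO guideline behaves like a noise dose with a 3-decibel exchange rate: 80 dB is safe for roughly 40 hours a week, and each 3 dB increase halves the allowance. Developers have been able to read headphone levels through HealthKit since iOS 13. Here is a minimal sketch, assuming the user grants read access; the dose arithmetic in the comment is the common rule of thumb, not a documented Apple formula.

```swift
import HealthKit

let store = HKHealthStore()
let exposure = HKQuantityType.quantityType(forIdentifier: .headphoneAudioExposure)!

// Ask for read-only access to headphone audio exposure samples.
store.requestAuthorization(toShare: nil, read: [exposure]) { granted, _ in
    guard granted else { return }
    let query = HKSampleQuery(sampleType: exposure, predicate: nil,
                              limit: 10, sortDescriptors: nil) { _, samples, _ in
        for case let sample as HKQuantitySample in samples ?? [] {
            let dB = sample.quantity.doubleValue(for: .decibelAWeightedSoundPressureLevel())
            // 3 dB exchange rate: 80 dB -> ~40 h/week, 83 dB -> ~20 h, 86 dB -> ~10 h.
            let safeHours = 40.0 / pow(2.0, (dB - 80.0) / 3.0)
            print(String(format: "%.1f dB, roughly %.1f safe hours/week", dB, safeHours))
        }
    }
    store.execute(query)
}
```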

Vision features

  • Apple’s VoiceOver technology, which converts on-screen text to speech, gets an update in iOS 14. It now uses machine learning and the Neural Engine on Apple's devices to recognize and audibly describe what's happening on screen, even when third-party developers haven't built accessibility support into their apps. The iPhone or iPad will automatically recognize more of the objects, images, text and controls displayed on screen, while VoiceOver gives more natural and contextual feedback. For images and photos, VoiceOver can now read complete-sentence descriptions to detail what's on screen. And it automatically detects user interface controls such as buttons, labels, toggles, sliders and indicators. (A sketch comparing this with hand-authored descriptions appears after this list.)
  • The rotor, a gesture-based way to customize the VoiceOver experience, can now do more than before. The system already lets users make adjustments such as changing speech rate and volume, selecting special types of input such as braille, or tuning how VoiceOver moves from one item on screen to the next. watchOS 7 brings the technology to the Apple Watch, letting users navigate by characters, words, lines, headings and links. And with macOS Big Sur, users can configure the rotor with their preferred braille tables and get more options for moving through code when developing apps in Xcode. (A sketch of a custom rotor entry also appears after this list.)
  • Apple's Magnifier is one of its most commonly used accessibility features, and it gets an upgrade in iOS 14 and iPadOS 14. It now lets users zoom in on the area they're pointing at and capture multiple freeze frames at once, making it easier to read multipage documents or longer passages. They can also filter or brighten images to improve clarity. And Magnifier can now be used while multitasking on the iPad.
  • Apple's new software expands braille support with Braille Auto Panning, which lets users pan across longer passages of braille text without pressing a physical pan button on their external refreshable braille displays.
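
VoiceOver Recognition supplements, rather than replaces, the descriptions developers write themselves. For comparison, this is roughly what hand-authored VoiceOver support looks like in UIKit; the asset name and label text are illustrative.

```swift
import UIKit

// Hand-authored VoiceOver descriptions (asset name and text are illustrative).
// iOS 14's VoiceOver Recognition generates similar descriptions on-device
// when an app supplies none.
let photoView = UIImageView(image: UIImage(named: "beach-sunset"))
photoView.isAccessibilityElement = true
photoView.accessibilityLabel = "A sunset over the ocean, with two surfers in the water"
photoView.accessibilityTraits = .image

let playButton = UIButton(type: .system)
playButton.accessibilityLabel = "Play"
playButton.accessibilityHint = "Plays the selected video"
```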
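
The rotor is also extensible by apps: UIKit's UIAccessibilityCustomRotor lets a developer add custom navigation entries alongside the built-in ones. A minimal sketch, assuming a hypothetical headingViews array the rotor should step through:

```swift
import UIKit

// A custom rotor entry letting VoiceOver users jump between "headings"
// (headingViews is a hypothetical array of views acting as headings).
func makeHeadingsRotor(headingViews: [UIView]) -> UIAccessibilityCustomRotor {
    UIAccessibilityCustomRotor(name: "Headings") { predicate in
        let current = predicate.currentItem.targetElement as? UIView
        let index = current.flatMap { headingViews.firstIndex(of: $0) }
        // Move forward or backward depending on the rotor gesture.
        let next: Int
        switch (index, predicate.searchDirection) {
        case (let i?, .next):     next = i + 1
        case (let i?, .previous): next = i - 1
        default:                  next = 0  // no current item: start at the top
        }
        guard headingViews.indices.contains(next) else { return nil }
        return UIAccessibilityCustomRotorItemResult(targetElement: headingViews[next],
                                                    targetRange: nil)
    }
}

// Usage: container.accessibilityCustomRotors = [makeHeadingsRotor(headingViews: headings)]
```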

Back Tap

  • One accessibility feature many people may end up using is Back Tap. Available in iOS 14, it lets iPhone users trigger various quick actions by double- or triple-tapping the back of the phone. Users can open specific accessibility features or take a screenshot. They can also scroll, open Control Center, go to the home screen or bring up the app switcher.
  • One thing Back Tap can't easily do is launch the camera or snap a photo. Users can configure those actions by first creating a Siri Shortcut. The Shortcuts app, introduced two years ago, automates common and routine tasks. With Shortcuts, people have been able to create custom commands, such as a request that pulls together the surf report, the current weather, travel time to the beach and a sunscreen reminder, all triggered by saying "Hey Siri, surf time." Those shortcuts can then be mapped to a Back Tap setting.

Mobility/physical motor features

  • Apple's Voice Control tool gains new British English and Indian English voices, along with some new capabilities. The technology, launched at last year's WWDC, lets people with limited mobility navigate and operate their devices by issuing voice commands. It enables users to do things like add emoji while dictating an email, or divide the screen into a numbered grid so they can replicate screen taps or mouse clicks by calling out a number. Apple device owners can now use Voice Control together with VoiceOver to perform common VoiceOver actions such as "read all" or "activate" on a control. Apple has also built in hints and persistent grid or number overlays to make navigating a device by voice more consistent, and "sleep/wake" commands can now address a single device when several are running at once. (A sketch of labeling controls for Voice Control follows.)
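
Voice Control leans on the same accessibility metadata apps already expose, plus spoken-name hints developers can add themselves. Since iOS 13, UIKit's accessibilityUserInputLabels has let an app offer several speakable names for one control; a minimal sketch with illustrative names:

```swift
import UIKit

// Give Voice Control several speakable names for one control
// (the button title and label strings here are illustrative).
let sendButton = UIButton(type: .system)
sendButton.setTitle("➤", for: .normal)            // the glyph isn't speakable
sendButton.accessibilityLabel = "Send"            // what VoiceOver announces
sendButton.accessibilityUserInputLabels = ["Send", "Submit", "Go"]
// A user can now say "Tap Send", "Tap Submit" or "Tap Go".
```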

Accessible coding

  • Apple is making its Xcode coding tools more accessible. The company's Xcode Playgrounds and Live Previews will become easier for blind users to work with, much as its Swift Playgrounds coding courses have been accessible for years. The hope is that an accessible Xcode will encourage more people who are blind or have low vision to become programmers.

Xbox Adaptive Controller support

  • Apple devices will now support Microsoft's Xbox Adaptive Controller. That means people playing Apple Arcade games, including on Apple TV, will be able to use Microsoft's $100 device, which is designed to make gaming more accessible. Players can plug switches, buttons, pressure-sensitive tubes and other devices into the controller to handle any function normally performed by a standard controller.
  • Apple also supports other popular controllers, including the Bluetooth-enabled Xbox Wireless Controller, the PlayStation DualShock 4 and MFi game controllers. They can also be used alongside touch controls and the Siri Remote. (A sketch of detecting these controllers in code follows.)
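
All of these controllers reach apps through Apple's GameController framework, so games don't need device-specific code. A minimal sketch of detecting a newly connected controller and reacting to its A button:

```swift
import GameController

// Listen for any supported controller connecting (Xbox Adaptive Controller,
// Xbox wireless controllers, DualShock 4, MFi pads, and so on).
let observer = NotificationCenter.default.addObserver(
    forName: .GCControllerDidConnect, object: nil, queue: .main
) { note in
    guard let controller = note.object as? GCController,
          let gamepad = controller.extendedGamepad else { return }
    print("Connected: \(controller.vendorName ?? "unknown controller")")

    // Fires whenever the A button is pressed or released.
    gamepad.buttonA.pressedChangedHandler = { _, _, pressed in
        print(pressed ? "A pressed" : "A released")
    }
}
// Keep `observer` alive for as long as the callback should stay registered.
```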

