
How your digital trail ends up in the hands of the police



Michael Williams' every move was being tracked without his knowledge, even before the fire. In August, Williams, an associate of the R&B star and accused rapist R. Kelly, allegedly used explosives to destroy the car of a potential witness. When police arrested Williams, the evidence cited in the Justice Department's affidavit came largely from his smartphone and online activity: text messages to the victim, cell phone records, and his search history.

Investigators served Google with a "keyword warrant," asking the company to provide information on any users who had searched for the victim's address around the time of the arson. Police narrowed the results, identified Williams, and then filed another search warrant for two Google accounts linked to him. They found further searches: the "detonation properties" of diesel fuel, a list of countries that do not have extradition agreements with the United States, and YouTube videos of R. Kelly's alleged victims. Williams has pleaded not guilty.

Data collected for one purpose can always be used for another. Search history data, for instance, is usually collected to refine recommendation algorithms or build online profiles, not to catch criminals. Smart devices such as speakers, TVs, and wearables retain details of our lives so precise that they have been used as evidence both to convict and to exonerate in murder cases. Speakers don't need to overhear crimes or confessions to be useful to investigators; they keep time-stamped logs of all requests, along with details of their location and the user's identity. Investigators can access these logs and use them to verify a suspect's whereabouts, or even to catch them in a lie.

And it's not just speakers and wearables. In a year when big technology companies pledged support for activists demanding police reform, they still sold devices and hosted apps that give the government access to far more intimate data, from far more people, than traditional warrants and police methods would allow.

A November report in Vice found that the popular app Muslim Pro may have sold users' location data to government agencies. Any number of apps request location data, for purposes like weather forecasts or tracking your exercise habits. The Vice report found that the data broker X-Mode collected data from Muslim Pro users who received prayer reminders, then sold it to others, including federal agencies. Apple and Google have both banned developers from transferring data to X-Mode, but it had already collected data from millions of users.

The problem is not any single app, but an overly complex, under-scrutinized data-collection system. In December, Apple began requiring developers to disclose key details of their privacy practices in a "nutrition label" for each app. Users "agree" to most forms of data collection when they click "Agree" after downloading an app, but privacy policies are notoriously difficult to understand, and people often don't know what they are agreeing to.

An easy-to-digest summary like Apple's nutrition label is useful, but not even developers always know what data their apps will end up collecting. (Many developers contacted by Vice admitted they didn't even know what user data X-Mode accessed.)

The pipeline between commercial and state surveillance is widening as we adopt more always-on devices and as the simple act of clicking "I agree" waves away serious privacy concerns. This summer's national reckoning over policing and racial equity threw this quiet cooperation into sharp relief. Despite lagging diversity numbers, indifference to white nationalism, and mistreatment of nonwhite employees, several technology companies raced to offer public support for Black Lives Matter and to rethink their relationships with law enforcement.

Amazon pledged millions of dollars to racial equity groups this summer, and after years of defending the practice, it promised to pause (but not stop) selling facial recognition technology to police. But the company also noted an increase in police requests for user data, including the internal logs kept by its smart speakers.
