
Artificial Intelligence: The Enigmatic Foe of Your Privacy

Friday 07 June 2024 - 13:00

In the realm of technological advancements, the rise of artificial intelligence (AI) unveils a captivating panorama of possibilities. However, this sophisticated technology harbors an unsettling potential to gravely compromise the confidentiality of personal data.

AI and machine learning have transformed a myriad of domains, spanning computing, finance, medical research, automatic translation, and more, with the list growing each passing month. Yet these strides are accompanied by a recurring question: what impact do these technologies have on our privacy and the confidentiality of our data? Whatever the AI model in question, its development is fueled by ingesting astronomical quantities of data, some of which can be highly sensitive.

The Retention of Secrets by AI

One of the principal challenges faced by enterprises training artificial intelligences lies in the inherent capacity of these technologies to learn and memorize intricate patterns derived from their training data. This characteristic, while advantageous for enhancing model accuracy (preventing hallucinations, for instance), poses a significant risk to privacy.

Machine learning models, comprising algorithms or systems that enable AI to learn from data, can encompass billions of parameters, akin to GPT-3 with its staggering 175 billion parameters. These models leverage this vast expanse of data to minimize prediction errors. Therein lies the crux of the issue: during the process of adjusting their parameters, they may inadvertently retain specific information, including sensitive data.

For illustration, if models are trained on medical or genomic data, they could memorize private information that could be extracted through targeted queries, thereby jeopardizing the confidentiality of the individuals concerned. Envision a scenario where a cyberattack or an accidental data breach occurs within the organization possessing these models; malicious entities could potentially disclose this sensitive information.
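This kind of extraction can be illustrated with a deliberately tiny model. The sketch below is an assumption-laden toy, not a real AI system: the corpus, the patient name, and "condition x" are all invented placeholders. A word-level bigram model overfits its training text, and a targeted prompt then reproduces the memorized record verbatim.

```python
from collections import defaultdict

# Toy "language model": a word-level bigram table built from training text.
# The corpus includes one fabricated sensitive record, mimicking private
# data that slipped into a training set (all names and values are invented).
corpus = (
    "the clinic treats many patients every day . "
    "patient jane doe tested positive for condition x . "
    "the clinic opens at nine every day ."
)

bigrams = defaultdict(lambda: defaultdict(int))
tokens = corpus.split()
for prev, nxt in zip(tokens, tokens[1:]):
    bigrams[prev][nxt] += 1

def complete(prompt, steps=6):
    """Greedily continue a prompt with the most frequent next token."""
    out = prompt.split()
    for _ in range(steps):
        choices = bigrams.get(out[-1])
        if not choices:
            break
        out.append(max(choices, key=choices.get))
    return " ".join(out)

# A targeted query reconstructs the memorized sensitive record verbatim.
print(complete("patient jane"))
```

Real extraction attacks on large models work on the same principle, only at scale: rare or unique training sequences are disproportionately easy to regurgitate when prompted with their prefix.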

AI and the Prediction of Sensitive Information

AI models can also harness seemingly innocuous data to deduce sensitive information. A striking example is that of the Target retail chain, which successfully predicted pregnancies by analyzing customers' purchasing habits. By cross-referencing data such as the purchase of dietary supplements or unscented lotions, the model could identify potentially pregnant customers and target them with specific advertisements. This case demonstrates that even mundane data can unveil highly personal aspects of one's privacy.

Despite efforts to limit data memorization, most current methods have proven ineffective. However, there is one technique presently considered the most promising for ensuring a degree of confidentiality during model training: differential privacy. But as you will see, it is far from miraculous.

Differential Privacy: An Imperfect Solution?

To explain differential privacy in simple terms, consider this example: imagine participating in a survey, but you do not want anyone to know whether you took part or how you answered. Differential privacy introduces a small amount of "noise," or randomness, into the survey data, so that even someone with access to the results cannot be certain of your specific responses. The data can still be analyzed in aggregate without compromising your individual privacy.

This method has been adopted by industry titans like Apple and Google. However, even with this protection, AI models can still draw conclusions or make predictions about personal or private information. A stronger variant, local differential privacy, adds the noise on each user's device before the data is ever transmitted to the organization, so the collector never sees the raw values at all.
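Local differential privacy is often realized with randomized response, a classic technique in which each respondent adds the noise themselves before reporting anything. A minimal sketch, with invented coin-flip parameters and a fabricated 20% true rate:

```python
import random

def randomized_response(truth):
    """Local DP via classic randomized response: each user flips a coin
    on their own device. Heads: report the truth. Tails: report the
    result of a second, independent coin flip instead."""
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports):
    """Under this scheme, P(report yes) = 0.25 + 0.5 * true_rate, so the
    aggregator can invert the bias without ever seeing a raw answer."""
    observed = sum(reports) / len(reports)
    return (observed - 0.25) / 0.5

random.seed(42)
truths = [random.random() < 0.2 for _ in range(100_000)]  # 20% true "yes"
reports = [randomized_response(t) for t in truths]
print(round(estimate_true_rate(reports), 3))  # approximately 0.2
```

No individual report is trustworthy on its own, yet the population-level estimate converges on the true rate, which is precisely the trade local differential privacy offers.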

Despite its advantages, differential privacy is not without limitations. Its primary drawback is that it can significantly degrade the performance of machine learning methods: the resulting models may be less accurate, provide erroneous answers, and be slower and costlier to train.

Therefore, a compromise must be struck between achieving satisfactory results and providing sufficient protection for individuals' privacy. A delicate balance must be found and, more importantly, maintained as the AI sector continues to expand. While AI can assist you in your daily life, whether for professional, personal, or academic purposes, do not consider it an ally of your confidentiality, far from it.

In summary, AI models can retain sensitive information during training, and even innocuous data can lead them to draw conclusions that compromise privacy. The differential privacy method is employed to limit this phenomenon, but it is far from perfect.



