Tech Beat by Namecheap – 26 May 2023
Generative AI, the hot-topic subfield of AI, is making headlines for its ability to create artistic images, realistic videos, and more. However, there is growing concern about bias in generative AI. Racial and gender bias can enter through the data used to train machine learning algorithms, the algorithms themselves, and the assumptions made by developers. Read this week’s top story to learn more about the serious consequences of bias in AI.
In other news
- Government web surveillance program is expanding in the UK. The use of Internet Connection Records (ICRs) by law enforcement agencies in the UK is highly controversial, with privacy advocates warning of over-retention and misuse of personal data. The UK’s National Crime Agency (NCA) has conducted a successful trial of ICRs, which identified previously unknown individuals accessing illegal images of children. The Home Office has now issued a procurement notice for a “national ICR service,” with defense firm BAE Systems awarded the contract. According to Wired, details of the system are being kept confidential, citing national security and law enforcement grounds. Critics fear that the widespread use of ICRs could compromise personal privacy and civil liberties, even as law enforcement officials argue that they are a valuable investigative tool.
- OpenAI CEO supports laws to mitigate the many risks of his own tech. Sam Altman, the CEO of OpenAI, testified in front of the US Congress and expressed his support for regulatory guardrails for artificial intelligence (AI) technology. The Guardian reports Altman proposed licensing and testing requirements for the development and release of AI models, establishing safety standards, and allowing independent auditors to examine the models before they are launched. He also called for a new regulatory agency for the technology, citing the complexity and rapid advancement of AI. The hearing was less contentious than those featuring other tech CEOs, with lawmakers acknowledging Altman’s calls for regulation and his understanding of the risks associated with generative AI.
- SpaceX launches second private mission to space. CNN reports that three paying customers and a decorated NASA astronaut traveled on a SpaceX rocket to spend a week aboard the International Space Station. Organized by Axiom Space, the mission is called Ax-2 and features the first woman from Saudi Arabia to travel to space. Axiom Space and NASA hope the mission will motivate others to participate in private-sector spaceflight. The crew will join the seven astronauts already aboard the station and help with investigations and science projects, including biomedical research such as stem cell studies.
- EU fines Meta €1.2B for privacy violations. The Irish Data Protection Commission revealed that Meta violated GDPR when the tech giant moved vast swathes of personal data from the EU to the US without protection from Washington’s surveillance practices. According to Politico, Meta has been using a legal instrument known as standard contractual clauses (SCCs) to transfer data to the US. However, requirements to use SCCs have been tightened since 2020, when the European Court of Justice invalidated Privacy Shield, an EU-US data flows agreement, due to fears over US surveillance. Since then, Meta and other international companies have struggled to find a new data flow arrangement. This fine is the largest ever to be imposed over GDPR violations.
- Scam ChatGPT apps are increasing in app stores. Researchers from security firm Sophos discovered many apps in Google Play and Apple’s App Store offering access to ChatGPT for a subscription fee, even though ChatGPT already offers a free version on the OpenAI website. The apps often start with a free trial and quickly begin charging exorbitant prices for the service. According to Wired, one app called ChatGBT was filled with ads and only allowed users to send three chatbot queries before demanding a $10 monthly subscription. These kinds of scam apps are known as fleeceware, and they make money by counting on users not knowing how to manage their app store subscriptions. Most users will delete the app but won’t unsubscribe, so they keep being charged. ChatGPT does have an official app, but it’s currently only available in Apple’s App Store and to US users. However, there are plans to roll it out to Android and to other countries soon.
Previously in Tech Beat: The problem with fake reviews
Have you ever wondered if those rave reviews you see online are actually legit? Check out our article, Don’t Get Fooled: The Truth About Fake Reviews, where we examine this tricky issue. We talk about how these fake reviews operate, how they can mess with businesses and shoppers alike, and tips on how to spot the real from the fake. It’s a must-read if you want to avoid getting duped while shopping online.
Tip of the week: Keep yourself safe in the app stores
Apps have become an integral part of our lives, making it easier to connect, shop, and entertain ourselves. With millions of apps available on both the Apple and Android app stores, choosing the right one — while protecting your privacy and bank account — can be overwhelming. But fret not! There are a few things you can do to select apps wisely and keep your security in check.
- Consider your needs and interests. Think about what you want the app to do for you and what features are essential. For instance, if you love cooking, you may want an app that provides recipes and cooking tips. Avoid apps that request excessive permissions or unnecessary access to your personal details. Apps that ask for too much info can be data collection scams in disguise.
- Read reviews and ratings before downloading an app. Many users provide valuable feedback on their experience, which can give you insight into the app’s performance, user-friendliness, and reliability. But watch out for reviews that seem robotic or repeat similar statements. Many fake reviews can be easily spotted when you look at them closely.
- Pay attention to the app’s developer and updates. Check if the developer has a good reputation for providing quality apps and has a history of addressing user complaints. Additionally, ensure that the app receives regular updates to fix bugs or add new features. When in doubt, it’s best to go with a better-known app developer with a history of creating reliable software.