Tech Beat by Namecheap – 30 June 2023
The hustle culture mentality is still widespread in 2023. Whether you’re making extra cash with a side business or becoming your own boss, many believe success can only come at the expense of everything else. Unfortunately, this mentality can lead to physical and mental health problems, not to mention the endless scammers waiting to exploit eager would-be hustlers. Learn more about the negative impacts and how to avoid them in this week’s lead.
In other news
- A lawyer regrets using ChatGPT for legal research. In a lawsuit against the airline Avianca, a lawyer named Steven A. Schwartz used the artificial intelligence program ChatGPT to prepare a court filing, The New York Times reports. However, the filing was found to contain multiple non-existent legal cases and quotes, prompting the judge to order a hearing on potential sanctions. Schwartz, who has practiced law for three decades, admitted he used the AI tool for legal research and was unaware it could produce false information. Legal ethics professor Stephen Gillers said the case highlights the need for professionals to verify any information provided by AI software.
- Pangolin-inspired tiny robot heals humans from the inside. Scientists at the Max Planck Institute for Intelligent Systems in Stuttgart have taken inspiration from nature to create a tiny soft robot capable of navigating the human body. It is modeled after the pangolin, which can curl up and move fluidly thanks to its interlocking keratin scales, according to an article in New Atlas. The millirobot, measuring 2 cm long and 1 cm wide, features overlapping aluminum scales on a soft polymer layer embedded with magnetic particles. By steering the robot with a low-frequency magnetic field, researchers can roll it up and transport particles such as medicine to specific areas of the body.
- The show must go on. Meta has hit the mute button on musical tribute acts that impersonate stars like Freddie Mercury and Dolly Parton. According to reporting in The Guardian, these acts run afoul of Meta’s anti-impersonation policies, and without the ability to use Facebook, the artists are struggling to reach their audiences. However, these impersonators aren’t ready to exit the stage just yet. In a grand show of defiance, they protested Tuesday outside Meta’s HQ.
- Bio-mimicking gel electrodes that outperform metal. MIT reports that its engineers have cooked up a Jell-O-like electrode that is soft and metal-free, mimicking the softness and resilience of biological tissue. The team made this innovative material from a high-performance conducting polymer hydrogel and whipped it into a 3D-printable ink, which they used to print flexible, rubbery electrodes. With successful early tests in rats, this breakthrough could one day kick metal electrodes to the curb and step in as the more biocompatible alternative for medical implants, from post-heart-surgery support to pacemakers and deep-brain stimulators.
- High-tech shackles in Fulton Jail. The Fulton County, Georgia, Sheriff’s Office has partnered with Georgia-based firm Talitrix on a cutting-edge surveillance system meant to address long-standing problems at Fulton County Jail, including inmates sleeping on plastic trays on the floor, cell doors hanging off their hinges, and water leaks. Wired reports that the system pairs hundreds of sensors embedded in the jail’s walls with inmate wristbands that use radio frequencies to log each inmate’s location every 30 seconds, record heart rates, and generate 3D visualizations of inmate interactions. The sheriff’s office and Talitrix assert that the technology can bolster efficiency, particularly in understaffed jails, and enhance safety by alerting staff to potential health issues or suicide attempts. Critics, however, are concerned about the heightened surveillance and question whether the technology addresses the underlying issues within the jail or the wider criminal justice system.
Previously in Tech Beat: the dark world of doxxing
Doxxing, the practice of publicly revealing someone’s personal information without consent, has escalated in recent years with the rise of social media. Information including addresses, phone numbers, and social security numbers can be procured through hacking, social engineering, or open-source intelligence gathering. While some perpetrators of doxxing claim to act as whistleblowers or vigilantes, victims often face severe consequences such as harassment, identity theft, and physical harm. To avoid being doxxed, be cautious about the personal information you share online, and consider using privacy tools like VPNs to safeguard yourself against such attacks. To learn more about doxxing and how to protect yourself, read our article, Doxxing: what it is and why it’s a growing concern.
Tip of the week: Double-check your AI-generated content
When using tools like ChatGPT for research, it’s critical to verify the accuracy of the information they provide. Double-checking your work is a best practice for any research-based writing. Whether you’re writing a blog post for your company website or a guest post for your next content marketing campaign, here are some best practices to help you validate your AI chatbot’s findings.
- Cross-verification. Always cross-check the facts, data, or cases provided by ChatGPT with other reliable sources, such as official databases, well-known publications, industry websites, or published books.
- Identify primary sources. If possible, trace the information to its primary source. Primary sources are usually the most accurate and reliable.
- Expert consultation. For complex or specialized topics, consider consulting an expert in the field to confirm the accuracy of the information. If you don’t know any experts, there’s probably an online community where you can find one.
- Use multiple AI tools. Consider querying more than one AI tool for the same information. If they return the same result, that can increase your confidence in its accuracy. Remember, though, that AI tools may share similar training data, so agreement doesn’t replace the need for human verification; a minimal sketch of this approach follows below.
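If you’re comfortable with a bit of scripting, here’s one way to put that last tip into practice: a minimal Python sketch that sends the same question to two chat models and prints the answers side by side for you to compare. The openai calls reflect that package’s API as of mid-2023, and the model names and the example question are illustrative only, not a recommendation of specific tools.

```python
# Minimal sketch: ask two chat models the same factual question and
# print both answers for side-by-side human comparison.
# Assumes the `openai` Python package (pre-1.0, mid-2023 API) and an
# OPENAI_API_KEY environment variable; model names are examples.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

QUESTION = "List three landmark court cases on airline liability, with citations."

def ask(model: str, question: str) -> str:
    """Send one question to one chat model and return its text reply."""
    response = openai.ChatCompletion.create(
        model=model,
        messages=[{"role": "user", "content": question}],
        temperature=0,  # reduce randomness so answers are easier to compare
    )
    return response["choices"][0]["message"]["content"]

# Swap in any two models or providers you have access to.
for model in ("gpt-3.5-turbo", "gpt-4"):
    print(f"--- {model} ---")
    print(ask(model, QUESTION))
    print()
```

Printing both answers rather than comparing them automatically is deliberate: two models rarely phrase the same facts identically, so the judgment call belongs to you. And even when both answers agree, check any cited case or statistic against a primary source before publishing.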