AI Needs Government Regulation, Report Says
A new report from researchers at AI Now argues that AI systems and their developers need government intervention and public accountability because the tech industry isn't doing a good job of regulating itself. According to the report, AI-based tools are being deployed with little consideration of their potential negative effects and little documentation of their positive ones. These aren't trial products, either: untested, undocumented AI systems are already deployed and affecting millions of people, from assessing immigrants for criminality to facial recognition and grading students. The frameworks currently governing AI aren't capable of ensuring accountability, which in turn allows AI systems to cause real harm to consumers, and the companies building these systems have little ethical guidance to work from. The report therefore calls for government regulation, public accountability, more funding and collaboration across the AI community. TechCrunch
FDA Wants to Regulate New Types of Medical Devices
The U.S. Food and Drug Administration's De Novo pathway is a process for classifying new, low- to moderate-risk medical devices to provide assurance of their safety and effectiveness. On Dec. 4, the FDA published the De Novo Classification Proposed Rule. If finalized, the rule would establish the procedures and criteria for the De Novo classification process and become part of the Medical Device Classification Procedures. The proposed rule would let the health device market keep developing while giving patients peace of mind that a device is safe and effective at improving health. The rule would provide clearer guidance on De Novo requirements, covering the format and content of requests and the processes and criteria for accepting, granting, declining and withdrawing them. These clarifications can also help medical device applicants decide whether this registration pathway is worth pursuing at all, given its cost, and should create a more predictable environment for companies looking to classify medical devices.
This matters because the De Novo requests granted marketing authorization in 2017 and 2018 included the first self-fitting hearing aid, the first mobile app to help treat substance use disorders, and the first artificial intelligence-based software used to detect more-than-mild diabetic retinopathy. FDA.gov
Location Data May Be More Personal Than We Think
At least 75 companies receive anonymous but precise location data from smartphone applications when users enable location services (typically to get local news and weather), according to a recent report by The New York Times. These businesses are tracking millions of mobile devices in the U.S., and the location data reveals people's travels to within a few yards, in some cases updated more than 14,000 times a day. Companies then sell, use or analyze the data for targeted advertising. And although the businesses say they're interested in behavior and location patterns, not identities, and that the data is tied only to an anonymous unique ID, anyone with access to the raw data can still identify a person without consent. The New York Times
Space Needs Infrastructure, Not Just Rockets
Private-sector rocket makers like SpaceX are building the big, powerful, technologically advanced rockets that get us to space, but truly commercializing space requires sustainable infrastructure. The small but growing companies building that infrastructure get overlooked, yet they are creating what people will need to safely live and work in space: satellite traffic management, crew habitats, device connectivity, life support systems and food solutions. For example, New Mexico-based Solstar is working with existing satellite operators to build a commercial internet network providing Wi-Fi connectivity between devices in space and on the ground. Another company, Cognitive Space, is building an AI-driven control system to automate satellite operations, so people don't have to monitor and manage each satellite individually as orbital traffic management becomes more challenging. Forbes
Australia’s Encryption Law Passed, Now What?
Australia’s parliament recently passed legislation allowing the country’s law enforcement and intelligence authorities to demand access to end-to-end encrypted digital communications, meaning they can force big tech companies like Apple and Facebook to build encryption backdoors into messaging platforms. But this raises major privacy and public safety concerns with global implications: if Australia can make a company weaken the security of its product for law enforcement, that backdoor becomes universal, because the tech industry is global. Such a backdoor could then be exploited by criminals and governments outside Australia, and once a company creates a backdoor tool for Australian law enforcement, other countries will begin to ask for one, too. For now, Australia is the “testing ground” for this type of encryption-busting legislation, but technologists and privacy advocates say it will inevitably shape global policy. Wired