- The AI Verdict
AI legislation sweeps across six states, legal risks in space and copyright law debated
Welcome back to the newsletter dedicated to informing you of the evolving relationship of artificial intelligence and law.
What you need to know this week:
Six states have passed or plan to pass AI legislation this year
Applications and emerging legal risks of AI use in space
House committee discusses AI's impact on copyright law, questions OpenAI CEO
Legal challenges and practical solutions for AI-based code generators trained on open source code
This week’s AI legal-tech spotlight
Let’s jump in. [4-5 min read]
Several parts of the world are enacting or strengthening laws to protect consumers from advanced AI tools.
Federal AI legislation in the US has stalled, leaving it to state and local governments to regulate AI tools such as OpenAI's ChatGPT.
Only six US states (California, Colorado, Connecticut, Illinois, Maryland, and Virginia) have or will have laws by the end of 2023 to prevent businesses from using AI to discriminate or deceive consumers and job applicants.
New York City has joined in those efforts with an ordinance regulating the use of AI in the hiring process.
The US states with legislation in place target similar protections: outlawing AI "profiling," requiring businesses to explain the logic behind AI decision-making, offering consumers an opt-out, and, in Illinois and New York City, preventing employers from collecting a job applicant’s personal data to make hiring decisions.
In Maryland, employers are prohibited from using facial recognition technology to identify job candidates, unless the candidates consent.
California's laws offer the most robust consumer protections, including blocking profiling, outlawing online bots that promote sales of goods and services to people within its borders, and prohibiting bots used to influence election votes.
Compliance challenges are expected as these laws take effect, and enforcement resources vary by state; California is best equipped, having allocated dedicated funding to enforce its laws.
The space and satellite industry is now extensively using AI due to the need for autonomous operations and the vast amounts of raw data available from space.
Exciting new applications of AI in space include space robotics, avoiding collisions and monitoring space debris, and space exploration.
Space data is being used for many analytics applications, such as geographical information systems, wildlife conservation, and disaster response.
New risks are arising in space, including liabilities for damage caused in space, risks associated with dangerous materials, and the risk of reentry.
Ownership-related risks and regulatory risks are also emerging in space, as existing law and treaties are minimalist, and there is a lack of clarity on which laws should apply.
Contract provisions are being used to manage risks in space, but without a backdrop of existing law, contracts can be lengthy and inevitably have gaps.
Insurance is available for some space risks, but for many new situations, there is no insurance available due to limited data and risk assessments.
New laws and regulations dealing with the space environment are slowly developing but mainly in the area of licensing. The question remains if legislators or regulators know enough about the new environment to pass laws or regulations that will help manage risk without stifling innovation.
The House Judiciary Committee held a hearing to discuss the effects of AI-powered generation tools on U.S. copyright law.
OpenAI's CEO Sam Altman was questioned about the company's programs that use data from human artists to generate new content.
Lawmakers are considering how the government can respond to AI's impact on copyright, and how to balance protecting human creators from copyright infringement by artificial intelligence, while promoting innovation in AI technology.
Congressman Darrell Issa, chair of the panel’s Subcommittee on Courts, Intellectual Property, and the Internet, said that the U.S. needs to strike a balance between protecting human creators and promoting innovation in AI technology.
The panel's Democratic ranking member, Georgia Representative Hank Johnson, agreed with Issa but added that forcing AI models to obtain licenses for copyrighted works raises new questions about how such a system would work and how creators would be credited and compensated.
The expert panel invited to testify at the hearing said that carefully balanced AI regulation is vital to ensuring that the U.S. remains a leader in moving the technology forward.
In their view, while policymakers should take seriously the genuine concerns of content creators that AI-generated works will displace human artists, replacing fair use with a licensing regime would stifle AI development and pose a difficult enforcement challenge.
AI-based code generators use AI models to suggest code and simplify the code development process. However, using open source code to train these models raises potential legal issues.
Open source licenses permit usage of the code with conditions that range from simple compliance obligations to more onerous, substantive requirements.
Using the output of an AI code generator may not lead to infringement claims, but failure to comply with license obligations could be a breach of contract.
If the code output from an AI code generator is covered by a restrictive ("copyleft") open source license, incorporating it into another program can taint that program, requiring the entire program to be licensed under the same terms.
Failure to comply with open source license terms can result in legal problems such as termination of the license and loss of right to use the open source software.
Known solutions such as filters, code referencing tools and code scanning tools can mitigate legal risks associated with AI-based code generators using open source code.
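To make the scanning idea concrete, here is a minimal sketch of what a license-header check on generated code might look like. This is purely illustrative: the pattern list, function names, and SPDX identifiers below are assumptions for the example, not the API of any real scanning product, and a production tool would match code fingerprints, not just header text.

```python
import re

# Hypothetical license-header patterns, keyed by SPDX identifier.
# Real scanners use far more robust detection than these illustrative regexes.
LICENSE_PATTERNS = {
    "GPL-3.0": re.compile(r"GNU General Public License", re.IGNORECASE),
    "Apache-2.0": re.compile(r"Apache License,? Version 2\.0", re.IGNORECASE),
    "MIT": re.compile(r"MIT License", re.IGNORECASE),
}

# Copyleft licenses whose terms could "taint" a combined program.
RESTRICTIVE = frozenset({"GPL-3.0"})

def detect_licenses(snippet: str) -> list[str]:
    """Return SPDX identifiers whose header text appears in the snippet."""
    return [spdx for spdx, pat in LICENSE_PATTERNS.items() if pat.search(snippet)]

def flag_restrictive(snippet: str) -> bool:
    """True if the snippet matches a copyleft license and needs legal review."""
    return any(spdx in RESTRICTIVE for spdx in detect_licenses(snippet))
```

A check like this could run on AI-suggested code before it is merged, flagging snippets for license review rather than blocking them outright.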
This week’s AI legal-tech spotlight
Smokeball: Smokeball is a cloud-based law practice management software that provides various features for law firms, including time tracking, document assembly, and task management. It uses AI-powered automation to help lawyers save time and reduce errors in their work.
Thanks for reading. See you next Friday.