The Business & Technology Network
Helping Business Interpret and Use Technology

House Looks To Make KOSA And COPPA Worse

DATE POSTED:September 18, 2024

If you had "Congress takes bad 'protect the children' bills and makes them worse" on your bingo card, congratulations, you've won this week's easiest prediction.

This seems like a not great situation. As mentioned, the House Energy & Commerce Committee is holding a markup today for a bunch of bills. Yesterday, they revealed the various amendments being proposed, including three replacement bills (called Amendments in the Nature of a Substitute or “AINS”) for KOSA, COPPA 2.0, and APRA.

COPPA 2.0 is another problematic "protect the children" bill that takes the already problematic COPPA law and makes it way worse. In the Senate, the vote on KOSA (which was approved 91 to 3) was actually on a merged version of KOSA and COPPA 2.0, referred to as KOSPA. So, what happens to both bills is important. The inclusion of APRA is a bit strange, as it was an attempt at a comprehensive federal privacy bill that I thought was probably about as good as we might otherwise get, though it involved compromises. But it died a quick death as folks on both sides of the issue hated those compromises. I have no idea why anyone would bring it back now, as it just didn't have the votes.

The new versions of KOSA and COPPA 2.0, however, still do not fix the underlying problems of these bills, and in some ways appear to make them worse. The new version of KOSA adds new language that will absolutely be abused by Republicans to force companies to remove LGBTQ+ content. At least the version that passed the Senate made some weak attempts to insulate the bill against claims that this was possible. Not so much here.

There’s a new part of the “duty of care” section (always the most problematic part of the bill) that says that “high impact online companies” need to “prevent and mitigate” the following:

Promotion of inherently dangerous acts that are likely to cause serious bodily harm, serious emotional disturbance, or death.

On its face, this might not seem all that troubling, until you actually think about it. First off, it seems like an attempt to hold websites liable for various challenges that the media loves to report on, though there is scant evidence that they lead to widespread copycat behavior. As we’ve discussed, these kinds of stupid “challenges” predate even the internet. As the CDC has noted, there are reports of such challenges appearing in newspapers dating back to the 1970s.

Did we threaten to put liability on newspapers for that? Of course not. Yes, it’s bad if kids are putting themselves in danger, but when these challenges made the rounds in the 70s and 80s we taught kids not to do stupid stuff. There’s no reason we can’t continue to educate kids, rather than hand off an impossible task to internet companies to magically find and stop all such content in the first place.

As much as people may not like it, most of those challenges are protected First Amendment speech. Holding companies liable for not magically making them disappear is going to be a problem.

Perhaps a bigger deal, though, is the inclusion of “serious emotional disturbance” in that description. The bill includes a definition for this, which might seem better than the vague “mental health disorder” that was in the original KOSA, but here it creates even more problems:

SERIOUS EMOTIONAL DISTURBANCE.—The term ‘‘serious emotional disturbance’’ means, with respect to a minor, the presence of a diagnosable mental, behavioral, or emotional disorder in the past year, which resulted in functional impairment that substantially interferes with or limits the minor’s role or functioning in family, school, or community activities.

So, um, can literally anyone in Congress explain how a social media platform will know whether or not (1) someone using their website has been diagnosed with a “serious emotional disturbance,” or (2) that content on their website will lead to a diagnosis of a “serious emotional disturbance”?

A pretty straight reading of this suggests that social media companies need to get direct access to everyone’s medical records to determine if they’ve been diagnosed with an “emotional disturbance.” And, well, that doesn’t seem good.

Also, given that much of the GOP falsely believes that a child coming out as transgender means that they have a "serious emotional disturbance," this part of the bill will almost certainly be used (as we warned) to try to force social media companies to remove LGBTQ+ content to avoid being accused of failing to "prevent and mitigate" such a "serious emotional disturbance."

Right after that section, the new version says that the “duty of care” means that a “high impact online company” must then somehow prevent or mitigate “compulsive usage” when they know a user is a “minor.” The definition for compulsive usage is… well… not great.

COMPULSIVE USAGE.—The term ‘‘compulsive usage’’ means a persistent and repetitive use of a covered platform that substantially limits 1 or more major life activities (as described in section 3(2) of the Americans with Disabilities Act of 1990 (42 U.S.C. 12102(2))) of an individual, including eating, sleeping, learning, reading, concentrating, thinking, communicating, and working.

So, over the past few weeks, I’ve been reading a fascinating book. I ended up staying up noticeably later than I planned to on multiple evenings as I was glued to the book. It also definitely limited my ability to concentrate and think about other stuff as it was so gripping. Is that compulsive usage? It might not be an online service, but if I’d been reading it on the Kindle, would it be?

How does one distinguish compulsive usage that one must “prevent” under this law from… some content people really like a lot?

The problem, of course, is that the bill doesn’t say. This means that companies will take the lazy way out and just block all sorts of content to be safe.

There are more problems, but this “amended” version is deeply, deeply problematic.

The same is true of the updated COPPA 2.0, especially for children who are estranged from their parents, or who live in a household where a parent does not accept them for who they are. Specifically, the bill gives parents of children and teenagers the ability to obtain basically any information a website has on their children, and to correct or delete that information.

It’s not hard to see how this could go very, very badly. Imagine an LGBTQ+ child who has not come out to their intolerant parents, but who has found communities online that are helpful to them. Parents can demand that internet platforms hand over all the info they provided to the platform:

require the operator to provide, upon the request of a parent of a teen or a teen under this subparagraph who has provided personal information to the operator, upon proper identification of that parent or that teen—

(i) a description of the specific types of personal information collected from the teen by the operator, the method by which the operator obtained the personal information, and the purposes for which the operator collects, uses, discloses, and retains the personal information;

(ii) the opportunity at any time to delete personal information collected from the teen or content or information submitted by the teen to a website, online service, online application, or mobile application and to refuse to permit the operator’s further use or maintenance in retrievable form, or online collection, of personal information from the teen;

(iii) the opportunity to challenge the accuracy of the personal information and, if the parent or the teen establishes the inaccuracy of the personal information, to have the inaccurate personal information corrected; and

(iv) a means that is reasonable under the circumstances for the parent or the teen to obtain any personal information collected from the teen, if such information is available to the operator at the time the parent or the teen makes the request;

Kids have privacy rights too, but not under this bill. The assumption here is that kids have no rights, that kids are the property of their parents, and that parents can get access to basically any content their kids access online, and even change the data on their kids. While there may be cases where that would be appropriate, there are so many where it is not, and the bill makes no effort to distinguish between them.

All in all, these new versions of the bill don’t fix the problems of the old ones, and in many ways make them worse.