Court To Indiana: Age Verification Laws Don’t Override The First Amendment

DATE POSTED: July 1, 2024

We keep pointing out that, contrary to the uninformed opinion of lawmakers across both major parties, laws that require age verification are clearly unconstitutional*.

* Offer not valid in the 5th Circuit.

Such laws have been tossed out everywhere as unconstitutional, except in Texas (and even then, the district court got it right, and only the 5th Circuit is confused). And yet, we hear about another state passing an age verification law basically every week. And this isn’t a partisan/culture war thing, either. Red states, blue states, purple states: doesn’t matter. All seem to be exploring unconstitutional age verification laws.

Indiana passed one last year, targeting adult content sites specifically. And, yes, there are perfectly good arguments that kids should not have access to pornographic content. But the Constitution does not allow such a restriction to be implemented in a sloppy way that is both ineffective at stopping kids and likely to block protected speech. And yet, that’s what every age-gating law does. The key point is that there are other ways to restrict kids’ access to porn than age-gating everything. But they often involve this thing called parenting.

Thus, it’s little surprise that, following a legal challenge by the Free Speech Coalition, Indiana’s law has been put on hold by a court that recognizes the law is very likely unconstitutional.

The court starts out by highlighting that geolocation is an extraordinarily inexact science, which is a problem, given that the law requires adult content sites to determine when visitors are from Indiana and to age-verify them.

But there is a problem: a computer’s IP address is not like a return address on an envelope because an IP address is not inherently tied to any location in the real world but consists of a unique string of numbers written by the Internet Service Provider for a large geographic area. (See id. ¶¶ 12–13). This means that when a user connects to a website, the website will only know the user is in a circle with a radius of 60 miles. (Id. ¶ 14). Thus, if a user near Springfield, Massachusetts, were to connect to a website, the user might be appearing to connect from neighboring New York, Connecticut, Rhode Island, New Hampshire, or Vermont. (Id.). And a user from Evansville, Indiana, may appear to be connecting from Illinois or Kentucky. The ability to determine where a user is connecting from is even weaker when using a phone with a large phone carrier such as Verizon with error margins up to 1,420 miles. (Id. ¶¶ 16, 19). Companies specializing in IP address geolocation explain the accuracy of determining someone’s state from their IP address is between 55% and 80%. (Id. ¶ 17). Internet Service Providers also continually change a user’s IP address over the course of the day, which can make a user appear from different states at random.
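
To get a sense of how coarse that really is, here is a rough sketch of the kind of lookup a website would have to rely on. It uses MaxMind’s geoip2 library with a GeoLite2 database, which is one common approach rather than anything the ruling names, and the database path and IP address are placeholders.

    # Rough sketch of how a website might guess a visitor's state from an IP
    # address, using MaxMind's geoip2 library with a GeoLite2 database (one
    # common approach; not anything the ruling names). The database path and
    # IP address are placeholders. Note accuracy_radius: the database only
    # claims the user is somewhere inside a circle of that radius, in km.
    import geoip2.database

    with geoip2.database.Reader("/path/to/GeoLite2-City.mmdb") as reader:
        # Substitute a real client IP; documentation-range IPs won't resolve.
        resp = reader.city("203.0.113.7")
        state = resp.subdivisions.most_specific.iso_code  # e.g. "IN" -- a guess
        radius_km = resp.location.accuracy_radius         # how fuzzy that guess is
        print(state, radius_km)
        # A ~60-mile (~97 km) circle around Evansville reaches into Illinois
        # and Kentucky, so "Indiana or not?" is a coin flip near the border.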

Also, users can hide their real IP address in various ways:

Even when the tracking of an IP address is accurate, however, internet users have myriad ways to disguise their IP address to appear as if they are located in another state. (Id. ¶ B (“Website users can appear to be anywhere in the world they would like to be.”)). For example, when a user connects to a proxy server, they can use the proxy server’s IP address instead of their own (somewhat like having a PO box in another state). (Id. ¶ 22). ProxyScrape, a free service, allows users to pretend to be in 129 different countries for no charge. (Id.). Virtual Private Network (“VPN”) technology allows something similar by hiding the user’s IP address to replace it with a fake one from somewhere else.

All these methods are free or cheap and easy to use. (Id. ¶¶ 21–28). Some even allow users to access the dark web with just a download. (Id. ¶ 21). One program, TOR, is specifically designed to be as easy to use as possible to ensure as many people can be as anonymous as possible. (Id.). It is so powerful that it can circumvent Chinese censors.

The reference to “Chinese censors” is a bit weird, but okay, point made: if people don’t want to appear to be connecting from Indiana, it’s easy to make sure they don’t.
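
The court’s point about proxies is easy to see in practice. Here is a minimal sketch using Python’s requests library; the proxy address is a placeholder rather than a real service, and httpbin.org/ip is just an endpoint that echoes back whatever IP address the server saw.

    # Minimal sketch of the court's point about proxies: route a request
    # through a proxy and the website sees the proxy's IP address, not the
    # user's. The proxy address below is a placeholder, not a real service.
    import requests

    proxies = {
        "http": "http://203.0.113.50:8080",   # hypothetical out-of-state proxy
        "https": "http://203.0.113.50:8080",
    }

    # httpbin.org/ip simply echoes back the IP address the server saw.
    direct = requests.get("https://httpbin.org/ip", timeout=10).json()
    relayed = requests.get("https://httpbin.org/ip", proxies=proxies, timeout=10).json()
    print(direct["origin"], relayed["origin"])  # different IPs; geolocation follows the proxy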

The court also realizes that just blocking adult content websites won’t block access to other sources of porn. The ruling probably violates a bunch of proposed laws against content that is “harmful to minors” by telling kids how to find porn:

Other workarounds include torrents, where someone can connect directly to another computer—rather than interacting with a website—to download pornography. (Id. ¶ 29). As before, this is free. (Id.). Minors could also just search terms like “hot sex” on search engines like Bing or Google without verifying their age. (Id. ¶ 32–33). While these engines automatically blur content to start, (Glogoza Decl. ¶¶ 5–6), users can simply click a button turning off “safe search” to reveal pornographic images, (Sonnier Decl. ¶ 32). Or a minor could make use of mixed content websites below the 1/3 mark like Reddit and Facebook.

And thus, problem number one with age verification: it’s not going to be even remotely effective for achieving the policy goals being sought here.

With this background, it is easy to see why age verification requirements are ineffective at preventing minors from viewing obscene content. (See id. ¶¶ 14–34 (discussing all the ways minors could bypass age verification requirements)). The Attorney General submits no evidence suggesting that age verification is effective at preventing minors from accessing obscene content; one source submitted by the Attorney General suggests there must be an “investigation” into the effectiveness of preventive methods, “such as age verification tools.”

And that matters. Again, even if you agree with the policy goals, you should recognize that putting in place an ineffective regulatory regime that is easily bypassed is not at all helpful, especially given that it might also restrict speech for non-minors.

Unlike the 5th Circuit, this district court in Indiana understands the precedents related to this issue and knows that Ashcroft v. ACLU already dealt with the main issue at play in this case:

In the case most like the one here, the Supreme Court affirmed the preliminary enjoinment of the Child Online Protection Act. See Ashcroft II, 542 U.S. at 660–61. That statute imposed penalties on websites that posted content that was “harmful to minors” for “commercial purposes” unless those websites “requir[ed the] use of a credit card” or “any other reasonable measures that are feasible under available technology” to restrict the prohibited materials to adults. 47 U.S.C. § 231(a)(1). The Supreme Court noted that such a scheme failed to clear the applicable strict scrutiny bar. Ashcroft II, 542 U.S. at 665–66 (applying strict scrutiny test). That was because the regulations were not particularly effective as it was easy for minors to get around the requirements, id. at 667–68, and failed to consider less restrictive alternatives that would have been equally effective such as filtering and blocking software, id. at 668–69 (discussing filtering and blocking software). All of that is equally true here, which is sufficient to resolve this case against the Attorney General.

Indiana’s Attorney General points to the 5th Circuit ruling that tries to ignore Ashcroft, but the judge here is too smart for that. He knows he’s bound by the Supreme Court, not whatever version of Calvinball the 5th Circuit is playing:

Instead of applying strict scrutiny as directed by the Supreme Court, the Fifth Circuit applied rational basis scrutiny under Ginsberg v. New York, 390 U.S. 629 (1968), even though the Supreme Court explained how Ginsberg was inapplicable to these types of cases in Reno, 521 U.S. at 865–66. The Attorney General argues this court should follow that analysis and apply rational basis scrutiny under Ginsberg.

However, this court is bound by Ashcroft II. See Agostini v. Felton, 521 U.S. 203, 237–38 (1997) (explaining lower courts “should follow the case which directly controls”). To be sure, Ashcroft II involved using credit cards, and Indiana’s statute requires using a driver’s license or third-party identification software. But as discussed below, this is not sufficient to take the Act beyond the strictures of strict scrutiny, nor enough to materially advance Indiana’s compelling interest, nor adequate to tailor the Act to the least restrictive means.

And thus, strict scrutiny must apply, unlike in the 5th Circuit, and this law can’t pass that bar.

Among other things, the age verification in this law doesn’t just apply to material that is obscene to minors:

The age verification requirements do not just apply to obscene content and also burden a significant amount of protected speech for two reasons. First, Indiana’s statute slips from the constitutional definition of obscenity and covers more material than considered by the Miller test. This issue occurs with the third prong of Indiana’s “material harmful to minors” definition, where it describes the harmful material as “patently offensive” based on “what is suitable matter for . . . minors.” Ind. Code § 35-49-2-2. It is well established that what may be acceptable for adults may still be deleterious (and subject to restriction) to minors. Ginsberg, 390 U.S. at 637 (holding that minors “have a more restricted right than that assured to adults to judge and determine for themselves what sex material they may read or see”); cf. ACLU v. Ashcroft, 322 F.3d 240, 268 (3d Cir. 2003) (explaining the offensiveness of materials to minors changes based on their age such that “sex education materials may have ‘serious value’ for . . . sixteen-year-olds” but be “without ‘serious value’ for children aged, say, ten to thirteen”), aff’d sub nom. in relevant part, 542 U.S. 656 (2004). Put differently, materials unsuitable for minors may not be obscene under the strictures of Miller, meaning the statute places burdens on speech that is constitutionally protected but not appropriate for children.

Also, even if the government has a compelling interest in protecting kids from adult content, this law doesn’t actually do a good job of that:

To be sure, protecting minors from viewing obscene material is a compelling interest; the Act just fails to further that interest in the constitutionally required way because it is wildly underinclusive when judged against that interest. “[A] law cannot be regarded as protecting an interest ‘of the highest order’ . . . when it leaves appreciable damage to that supposedly vital interest unprohibited.” …

The court makes it clear how feeble this law is:

To Indiana’s legislature, the materials harmful to minors are not so rugged that the State believes they should be unavailable to adults, nor so mentally debilitating to a child’s mind that they should be completely inaccessible to children. The Act does not function as a blanket ban of these materials, nor ban minors from accessing these materials, nor impose identification requirements on everybody displaying obscene content. Instead, it only circumscribes the conduct of websites who have a critical mass of adult material, whether they are currently displaying that content to a minor or not. Indeed, minors can freely access obscene material simply by searching that material in a search engine and turning off the blur feature. (Id. ¶¶ 31–33). Indiana’s legislature is perfectly willing “to leave this dangerous, mind-altering material in the hands of children” so long as the children receive that content from Google, Bing, any newspaper, Facebook, Reddit, or the multitude of other websites not covered.

The court also points out how silly it is that the law only applies to sites where adult content crosses a certain threshold (33% of images). If the goal is to block kids’ access to porn, that’s a stupid way to go about it. Indeed, the court effectively notes that a website could get around the requirement just by adding a bunch of non-adult images.

The Attorney General has not even attempted to meet its burden to explain why this speaker discrimination is necessary to or supportive of its compelling interest; why is it that a website that contains 32% pornographic material is not as deleterious to a minor as a website that contains 33% pornographic material? And why does publishing news allow a website to display as many adult-images as it desires without needing to verify the user is an adult? Indeed, the Attorney General has not submitted any evidence suggesting age verification would prohibit a single minor from viewing harmful materials, even though he bears the burden of demonstrating the effectiveness of the statute. Ultimately, the Act favors certain speakers over others by selectively imposing the age verification burdens. “This the State cannot do.” Sorrell v. IMS Health Inc., 564 U.S. 552, 580 (2011). The Act is likely unconstitutional.

In a footnote, the judge highlights an even dumber part of the law: the 33% threshold is based solely on the percentage of images, regardless of how much other content a site hosts. He gives a hypothetical of a site that would be required to age-gate:

Consider a blog that discusses new legislation the author would like to see passed. It contains hundreds of posts discussing these proposals. The blog does not include images save one exception: attached to a proposal suggesting the legislature should provide better sexual health resources to adult-entertainment performers is a picture of an adult-entertainer striking a raunchy pose. Even though 99% of the blog is core political speech, adults would be unable to access the website unless they provide identification because the age verification provisions do not trigger based on the amount of total adult content on the website, but rather based on the percentage of images (no matter how much text content there is) that contain material harmful to minors.
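
To make the footnote’s math concrete, here is a rough sketch of the trigger as the court describes it; the function and the sample numbers are hypothetical, for illustration only.

    # Rough sketch of the Act's trigger as the court describes it: age
    # verification turns on the share of *images* that are "harmful to
    # minors," no matter how much text the site hosts. The function and the
    # sample numbers are hypothetical, for illustration only.
    def requires_age_gate(adult_images: int, total_images: int) -> bool:
        """True if at least one third of a site's images are adult content."""
        if total_images == 0:
            return False
        return adult_images / total_images >= 1 / 3

    # The court's hypothetical blog: hundreds of text posts, one image total,
    # and that single image is adult content.
    print(requires_age_gate(adult_images=1, total_images=1))      # True (100% of images)

    # Meanwhile, a large adult site could duck under the line by padding
    # itself with two innocuous images for every adult one.
    print(requires_age_gate(adult_images=100, total_images=301))  # False (~33.2%)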

The court suggests some alternatives to this law, from requiring age verification for accessing any adult content (though it notes that’s also probably unconstitutional, even if it’s less restrictive) to having the state offer free filtering and blocking tech that parents can use for their kids:

Indiana could make freely available and/or require the use of filtering and blocking technology on minors’ devices. This is a superior alternative. (Sonnier Decl. ¶ 47 (“Internet content filtering is a superior alternative to Internet age verification.”); see also Allen Decl. ¶¶ 38–39 (not disputing that content filtering is superior to age verification as “[t]he Plaintiff’s claim makes a number of correct positive assertions about content filtering technology” but noting “[t]here is no reason why both content filtering and age verification could not be deployed either consecutively or concurrently”)). That is true for the reasons discussed in the background section: filtering and blocking software is more accurate in identifying and blocking adult content, more difficult to circumvent, allows parents a place to participate in the rearing of their children, and imposes fewer costs on third-party websites.
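
To see why the court treats device-side filtering as the stronger tool, here is a toy sketch of the kind of check a filtering tool might run on a child’s device. The blocklist entries and helper function are made up for illustration; real parental-control tools hook into the operating system, DNS resolver, or browser rather than a standalone script.

    # Toy illustration of device-side content filtering, the less restrictive
    # alternative the court points to. The domains, keywords, and helper are
    # made up; real tools integrate with the OS, DNS resolver, or browser.
    from urllib.parse import urlparse

    BLOCKED_DOMAINS = {"example-adult-site.com"}   # hypothetical blocklist
    BLOCKED_KEYWORDS = {"porn", "xxx"}             # hypothetical keyword list

    def is_blocked(url: str) -> bool:
        """Return True if the URL matches the domain or keyword blocklist."""
        host = urlparse(url).netloc.lower()
        if any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS):
            return True
        return any(word in url.lower() for word in BLOCKED_KEYWORDS)

    print(is_blocked("https://example-adult-site.com/video"))    # True
    print(is_blocked("https://en.wikipedia.org/wiki/Indiana"))   # False
    # Because the check runs on the device itself, it works no matter which
    # state the website thinks the user is in, and parents control the list.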

And thus, because the law is pretty obviously unconstitutional, the judge grants the injunction, blocking the law from going into effect. Indiana will almost certainly appeal, and we’ll have to keep going through this nonsense over and over again.

Thankfully, Indiana is in the 7th Circuit, not the 5th, so there’s at least somewhat less of a chance for pure nuttery on appeal.