HTPA legislation is a campaign to block access to certain categories of online content by forcing manufacturers, distributors and retailers to install and activate filtering software on any device that allows access to the internet. The bills are difficult to parse because they are vague, opaque and internally inconsistent; each state's bill is similar but slightly different. Currently, there are 14 HTPA bills in 12 states, and the goal of the campaign is to pass the legislation in all 50 states. The title of the legislation is misleading because it addresses human trafficking only indirectly.
Generally, the legislation bars any company from doing business in a state if it manufactures, produces or sells any device in the state that provides internet access unless the device contains an “active and operating digital blocking” capability that blocks material that is obscene or obscene for minors. The software must also make inaccessible “private sexual images published without consent,” any “hub that facilitates prostitution” and any “website that facilitate[s] human trafficking.” These terms are often not defined.
Additional requirements for manufacturers, distributors and retailers:
- Sending out regular updates to “ensure the quality and performance of the filter.”
- Establishing a website or call center so consumers can report obscene or obscene-for-minors material that was not blocked by the filter.
- Within five days of a report being filed, the company must assess whether the material should have been blocked. If so, it must send a software update that blocks it.
- Allowing consumers to report material that was incorrectly blocked even though it is neither obscene nor obscene for minors. The company must unblock the material if it does not meet the definition of prohibited speech.
- If the company fails to do so in a timely fashion, the consumer can bring a cause of action.
The filters may also be deactivated if:
- The consumer specifically requests in writing that it be deactivated.
- The business verifies, in a face-to-face meeting conducted in person or through electronic means, that the consumer is an adult.
- The consumer receives a written “warning” of the potential danger of deactivating the filter and pays a $20 tax, which the business must remit to the state.
Distribution of a device without a filter that blocks obscene or obscene-for-minors material is subject to a prison sentence and a fine.
The state attorney general can seek an injunction to bar a company from doing business in the state if it fails to comply with any provision of the legislation.
A consumer can sue for damages, and recover legal fees, if he or she reported material that should have been blocked as obscene or obscene for minors and it was not subsequently blocked. A consumer can also sue for “judicial relief” if content is blocked that is not obscene or obscene for minors, though the consumer may or may not be able to recover legal fees in that case.
HTPA Campaign Background
The main organizer of the HTPA campaign is Chris Sevier. In a recent article in The Daily Beast, he said he helped draft the state bills and is meeting people in Washington, D.C. in an attempt to get a bill introduced in Congress. The story describes several lawsuits Sevier brought against Apple, Google, Microsoft and many other makers of internet devices to force them to install and activate filtering software on their devices; so far, he has not succeeded in court. The story also examines his checkered past, including the suspension of his law license and brushes with the law.
HTPA Legislation Is Unconstitutional and Overly Burdensome for Device Makers and Internet Users
HTPA legislation forces producers and distributors of devices that allow access to the internet to violate the First Amendment rights of content providers and consumers. It places an unreasonable burden on businesses that make and sell devices to comply with the legislation. It imposes unreasonable obstacles on adults who want to use the internet without the government or hardware companies limiting what speech they can access.
The legislation violates the First Amendment for multiple reasons:
- Mandatory filters are unconstitutional: In Ashcroft v. American Civil Liberties Union, the Supreme Court discussed filtering software to block sexual content. In considering whether voluntary use of filtering software is a less restrictive alternative to a law criminalizing online speech, the Court assumed that the government could not impose filtering software. 542 U.S. 656, 669 (2004).
- Voluntary filters are a less restrictive means: In Ashcroft, the Supreme Court reasoned that voluntary filters are a better solution for blocking access to illegal or unwanted material because end users can decide what is appropriate for themselves or their children without censoring speech for others.
- Overbreadth: The legislation forces companies to block adults’ access to speech that the First Amendment protects their right to read or see, even if posting that speech may be illegal. Similarly, the government cannot mandate blocking an entire website because something posted on it might be illegal speech.
- Vagueness: The legislation does not define the terms that determine what content must be filtered, such as “hubs for prostitution,” “known to facilitate human trafficking” or “reasonably proactive,” leaving businesses and consumers to guess at what the law requires. Grayned v. City of Rockford, 408 U.S. 104, 108 (1972).
- Taxation of speech: The $20 fee for deactivation of each device is an unconstitutional content-based taxation on speech. “[O]fficial scrutiny of the content of publications as the basis for imposing a tax is entirely incompatible with the First Amendment’s guarantee of freedom of the press.” Arkansas Writers’ Project, Inc. v. Ragland, 481 U.S. 221, 230 (1987).
- Compelled Speech: It is unconstitutional to force a maker or seller of devices to provide a written warning about the dangers of seeing sexually explicit speech to anyone who wants to disable the filtering software. Generally, “freedom of speech prohibits the government from telling people what they must say.” Rumsfeld v. Forum for Academic and Institutional Rights, Inc., 547 U.S. 47, 61 (2006).
The legislation imposes an unreasonable burden on makers and sellers of devices that must be filtered:
- It is impossible to develop software that blocks speech that is illegal without blocking material that is legal. Software is not sophisticated enough to assess the community standards where the device is located, determine what materials have serious value and judge the intent of the publisher. Filtering software will block far more material than is illegal. At the same time, it will fail to block material that is illegal, especially text.
- The creation and maintenance of filtering software will be costly and time consuming. Companies must either develop or license filtering software to be installed on thousands of devices. The software must be updated in perpetuity, and it must vary from state to state to reflect local community standards in the tests for obscenity and obscenity for minors.
- It is very expensive to set up a website or call center to take reports that the filter has not blocked illegal content or has blocked legal content. The business must hire staff who can review and make legal assessments of hundreds or thousands of reports.
- This legislation will invite frivolous lawsuits. Businesses will be sued for failing to install and activate filtering software; for over-filtering or under-filtering; for failing to review complaints quickly enough; and for failing to send regular software updates.
- The elaborate deactivation process will be difficult to develop and cumbersome to maintain. Businesses must develop a way to allow adult consumers to have a “face to face encounter” to prove they are 18 years old or older, even if they have no retail business in the state. They must collect a special tax. They must keep records of evidence that a person is an adult. They must supply written materials on the harms of deactivating.
- Some customers will opt to buy a device in another state to avoid the cost or stigma of the deactivation process.
- The legislation puts the responsibility on the businesses to prevent users from accessing illicit content rather than putting personal accountability on the person using the internet.
It places an unreasonable burden on consumers:
- The legislation is a heavy tax on consumers. A single person may have to pay three or four deactivation fees (computer, modem, router, Wi-Fi device) to be able to access the internet without filters on a single computer. A family with multiple devices might have to pay hundreds of dollars to turn off the filtering software.
- The deactivation process is burdensome and time-consuming for the consumer. He or she must make a written request, go to a face-to-face interview, be subjected to a lecture about the dangers of material on the internet and pay a $20 tax.
- The consumer may have to go through this process numerous times with makers and sellers of devices before getting unfiltered access to the internet.
- The consumer will be forced to receive a lecture approved by the state about the risks of deactivating the filters.
Current bills (click the link to see a summary of the bill and its text):
Alabama H.B. 428 – Referred to Commerce and Small Business Committee 3/16/2017
Georgia H.B. 509 – House second read on 3/1/2017 (was scheduled for a hearing but removed from agenda)
Indiana H.B. 1533 – Referred to Courts and Criminal Codes 1/18/2017
Louisiana H.B. 172 – Referred to Commerce Committee 3/28/2017
North Dakota H.B. 1185 – Withdrawn from further consideration 1/11/2017
Oklahoma H.B. 1472 – Amended and passed Judiciary and Criminal Justice and Corrections Committee 3/1/2017
South Carolina H.B. 3003 – Tabled prior to hearing in Judiciary Committee 2/8/2017 (Scheduled for hearing but pulled by sponsor)
Texas H.B. 2266 – Referred to Business and Industry Committee 3/15/2017
West Virginia S.B. 447 – Referred to Judiciary Committee 2/27/2017
Wyoming H.B. 245 – Died in Appropriations Committee 2/3/2017