In the Senate of the United States,
July 30, 2024.
Resolved, That the Senate agree to the amendment of the House of Representatives to the bill (S. 2073) “An Act to amend title 31, United States Code, to require agencies to include a list of outdated or duplicative reporting requirements in annual budget justifications, and for other purposes.”, with the following
SENATE AMENDMENTS TO HOUSE AMENDMENT:
(b) Table of contents.—The table of contents for this Act is as follows:
Sec. 1. Short title; table of contents.
Sec. 101. Definitions.
Sec. 102. Duty of care.
Sec. 103. Safeguards for minors.
Sec. 104. Disclosure.
Sec. 105. Transparency.
Sec. 106. Research on social media and minors.
Sec. 107. Market research.
Sec. 108. Age verification study and report.
Sec. 109. Guidance.
Sec. 110. Enforcement.
Sec. 111. Kids online safety council.
Sec. 112. Effective date.
Sec. 113. Rules of construction and other matters.
Sec. 120. Definitions.
Sec. 121. Requirement to allow users to see unmanipulated content on internet platforms.
Sec. 130. Relationship to State laws.
Sec. 131. Severability.
Sec. 201. Online collection, use, disclosure, and deletion of personal information of children and teens.
Sec. 202. Study and reports of mobile and online application oversight and enforcement.
Sec. 203. GAO study.
Sec. 204. Severability.
TITLE III—ELIMINATING USELESS REPORTS
Sec. 301. Sunsets for agency reports.
In this subtitle:
(2) COMPULSIVE USAGE.—The term “compulsive usage” means any response stimulated by external factors that causes an individual to engage in repetitive behavior reasonably likely to cause psychological distress.
(3) COVERED PLATFORM.—
(A) IN GENERAL.—The term “covered platform” means an online platform, online video game, messaging application, or video streaming service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.
(B) EXCEPTIONS.—The term “covered platform” does not include—
(i) an entity acting in its capacity as a provider of—
(I) a common carrier service subject to the Communications Act of 1934 (47 U.S.C. 151 et seq.) and all Acts amendatory thereof and supplementary thereto;
(II) a broadband internet access service (as such term is defined for purposes of section 8.1(b) of title 47, Code of Federal Regulations, or any successor regulation);
(IV) a teleconferencing or video conferencing service that allows reception and transmission of audio or video signals for real-time communication, provided that—
(V) a wireless messaging service, including such a service provided through short messaging service or multimedia messaging service protocols, that is not a component of, or linked to, an online platform and where the predominant or exclusive function is direct messaging consisting of the transmission of text, photos, or videos that are sent by electronic means, where messages are transmitted from the sender to a recipient, and are not posted within an online platform or publicly;
(iii) any public or private preschool, elementary, or secondary school, or any institution of vocational, professional, or higher education;
(iv) a library (as defined in section 213(1) of the Library Services and Technology Act (20 U.S.C. 9122(1)));
(v) a news or sports coverage website or app where—
(4) DESIGN FEATURE.—The term “design feature” means any feature or component of a covered platform that will encourage or increase the frequency, time spent, or activity of minors on the covered platform. Design features include but are not limited to—
(5) GEOLOCATION.—The term “geolocation” has the meaning given the term “geolocation information” in section 1302 of the Children's Online Privacy Protection Act of 1998 (15 U.S.C. 6501), as added by section 201(a).
(6) KNOW OR KNOWS.—The term “know” or “knows” means to have actual knowledge or knowledge fairly implied on the basis of objective circumstances.
(7) MENTAL HEALTH DISORDER.—The term “mental health disorder” has the meaning given the term “mental disorder” in the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition (or the most current successor edition).
(8) MICROTRANSACTION.—
(A) IN GENERAL.—The term “microtransaction” means a purchase made in an online video game (including a purchase made using a virtual currency that is purchasable or redeemable using cash or credit or that is included as part of a paid subscription service).
(B) INCLUSIONS.—Such term includes a purchase involving surprise mechanics, new characters, or in-game items.
(10) ONLINE PLATFORM.—The term “online platform” means any public-facing website, online service, online application, or mobile application that predominantly provides a community forum for user-generated content, such as sharing videos, images, games, audio files, or other content, including a social media service, social network, or virtual reality environment.
(11) ONLINE VIDEO GAME.—The term “online video game” means a video game, including an educational video game, that connects to the internet and that allows a user to—
(12) PARENT.—The term “parent” has the meaning given that term in section 1302 of the Children's Online Privacy Protection Act (15 U.S.C. 6501).
(13) PERSONAL DATA.—The term “personal data” has the same meaning as the term “personal information” as defined in section 1302 of the Children's Online Privacy Protection Act (15 U.S.C. 6501).
(14) PERSONALIZED RECOMMENDATION SYSTEM.—The term “personalized recommendation system” means a fully or partially automated system used to suggest, promote, or rank content, including other users, hashtags, or posts, based on the personal data of users. A recommendation system that suggests, promotes, or ranks content based solely on the user’s language, city or town, or age shall not be considered a personalized recommendation system.
(15) SEXUAL EXPLOITATION AND ABUSE.—The term “sexual exploitation and abuse” means any of the following:
(B) Child sexual abuse material, as described in sections 2251, 2252, 2252A, and 2260 of title 18, United States Code.
(a) Prevention of harm to minors.—A covered platform shall exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate the following harms to minors:
(1) Consistent with evidence-informed medical information, the following mental health disorders: anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors.
(5) Promotion and marketing of narcotic drugs (as defined in section 102 of the Controlled Substances Act (21 U.S.C. 802)), tobacco products, gambling, or alcohol.
(a) Safeguards for minors.—
(1) SAFEGUARDS.—A covered platform shall provide a user or visitor that the covered platform knows is a minor with readily accessible and easy-to-use safeguards to, as applicable—
(B) prevent other users or visitors, whether registered or not, from viewing the minor’s personal data collected by or shared on the covered platform, in particular restricting public access to personal data;
(C) limit design features that encourage or increase the frequency, time spent, or activity of minors on the covered platform, such as infinite scrolling, auto playing, rewards for time spent on the platform, notifications, and other design features that result in compulsive usage of the covered platform by the minor;
(2) OPTION.—A covered platform shall provide a user that the covered platform knows is a minor with a readily accessible and easy-to-use option to limit the amount of time spent by the minor on the covered platform.
(3) DEFAULT SAFEGUARD SETTINGS FOR MINORS.—A covered platform shall provide that, in the case of a user or visitor that the platform knows is a minor, the default setting for any safeguard described under paragraph (1) shall be the option available on the platform that provides the most protective level of control that is offered by the platform over privacy and safety for that user or visitor.
(b) Parental tools.—
(1) TOOLS.—A covered platform shall provide readily accessible and easy-to-use settings for parents to support a user that the platform knows is a minor with respect to the user's use of the platform.
(2) REQUIREMENTS.—The parental tools provided by a covered platform shall include—
(A) the ability to manage a minor’s privacy and account settings, including the safeguards and options established under subsection (a), in a manner that allows parents to—
(3) NOTICE TO MINORS.—A covered platform shall provide clear and conspicuous notice to a user when the tools described in this subsection are in effect and what settings or controls have been applied.
(4) DEFAULT TOOLS.—A covered platform shall provide that, in the case of a user that the platform knows is a child, the tools required under paragraph (1) shall be enabled by default.
(5) APPLICATION TO EXISTING ACCOUNTS.—If, prior to the effective date of this subsection, a covered platform provided a parent of a user that the platform knows is a child with notice and the ability to enable the parental tools described under this subsection in a manner that would otherwise comply with this subsection, and the parent opted out of enabling such tools, the covered platform is not required to enable such tools with respect to such user by default when this subsection takes effect.
(c) Reporting mechanism.—
(1) REPORTS SUBMITTED BY PARENTS, MINORS, AND SCHOOLS.—A covered platform shall provide—
(2) TIMING.—A covered platform shall establish an internal process to receive and substantively respond to such reports in a reasonable and timely manner, but in no case later than—
(A) 10 days after the receipt of a report, if, for the most recent calendar year, the platform averaged more than 10,000,000 active users on a monthly basis in the United States;
(d) Advertising of illegal products.—A covered platform shall not facilitate the advertising of narcotic drugs (as defined in section 102 of the Controlled Substances Act (21 U.S.C. 802)), tobacco products, gambling, or alcohol to an individual that the covered platform knows is a minor.
(e) Rules of application.—
(1) ACCESSIBILITY.—With respect to safeguards and parental tools described under subsections (a) and (b), a covered platform shall provide—
(A) information and control options in a clear and conspicuous manner that takes into consideration the differing ages, capacities, and developmental needs of the minors most likely to access the covered platform and does not encourage minors or parents to weaken or disable safeguards or parental tools;
(2) DARK PATTERNS PROHIBITION.—It shall be unlawful for any covered platform to design, modify, or manipulate a user interface of a covered platform with the purpose or substantial effect of subverting or impairing user autonomy, decision-making, or choice with respect to safeguards or parental tools required under this section.
(3) TIMING CONSIDERATIONS.—
(4) RULES OF CONSTRUCTION.—Nothing in this section shall be construed to—
(A) prevent a covered platform from taking reasonable measures to—
(B) require the disclosure of a minor's browsing behavior, search history, messages, contact list, or other content or metadata of their communications;
(f) Device or console controls.—
(1) IN GENERAL.—Nothing in this section shall be construed to prohibit a covered platform from integrating its products or services with, or duplicating controls or tools provided by, third-party systems, including operating systems or gaming consoles, to meet the requirements imposed under subsections (a) and (b) relating to safeguards for minors and parental tools, provided that—
(2) PRESERVATION OF PROTECTIONS.—In the event of a conflict between the controls or tools of a third-party system, including operating systems or gaming consoles, and a covered platform, the covered platform is not required to override the controls or tools of a third-party system if it would undermine the protections for minors from the safeguards or parental tools imposed under subsections (a) and (b).
(a) Notice.—
(1) REGISTRATION OR PURCHASE.—Prior to registration or purchase of a covered platform by an individual that the platform knows is a minor, the platform shall provide clear, conspicuous, and easy-to-understand—
(A) notice of the policies and practices of the covered platform with respect to safeguards for minors required under section 103;
(2) NOTIFICATION.—
(A) NOTICE AND ACKNOWLEDGMENT.—In the case of an individual that a covered platform knows is a child, the platform shall additionally provide information about the parental tools and safeguards required under section 103 to a parent of the child and obtain verifiable consent (as defined in section 1302(9) of the Children's Online Privacy Protection Act (15 U.S.C. 6501(9))) from the parent prior to the initial use of the covered platform by the child.
(B) REASONABLE EFFORT.—A covered platform shall be deemed to have satisfied the requirement described in subparagraph (A) if the covered platform is in compliance with the requirements of the Children's Online Privacy Protection Act (15 U.S.C. 6501 et seq.) to use reasonable efforts (taking into consideration available technology) to provide a parent with the information described in subparagraph (A) and to obtain verifiable consent as required.
(3) CONSOLIDATED NOTICES.—For purposes of this subtitle, a covered platform may consolidate the process for providing information under this subsection and obtaining verifiable consent or the consent of the minor involved (as applicable) as required under this subsection with its obligations to provide relevant notice and obtain verifiable consent under the Children's Online Privacy Protection Act (15 U.S.C. 6501 et seq.).
(b) Personalized recommendation system.—A covered platform that operates a personalized recommendation system shall set out in its terms and conditions, in a clear, conspicuous, and easy-to-understand manner—
(c) Advertising and marketing information and labels.—
(d) Resources for parents and minors.—A covered platform shall provide to minors and parents clear, conspicuous, easy-to-understand, and comprehensive information in a prominent location, which may include a link to a web page, regarding—
(a) In general.—Subject to subsection (b), not less frequently than once a year, a covered platform shall issue a public report describing the reasonably foreseeable risks of harms to minors and assessing the prevention and mitigation measures taken to address such risks based on an independent, third-party audit conducted through reasonable inspection of the covered platform.
(b) Scope of application.—The requirements of this section shall apply to a covered platform if—
(1) for the most recent calendar year, the platform averaged more than 10,000,000 active users on a monthly basis in the United States; and
(2) the platform predominantly provides a community forum for user-generated content and discussion, including sharing videos, images, games, audio files, discussion in a virtual setting, or other content, such as acting as a social media platform, virtual reality environment, or a social network service.
(c) Content.—
(1) TRANSPARENCY.—The public reports required of a covered platform under this section shall include—
(C) an accounting, based on the data held by the covered platform, of—
(i) the number of users using the covered platform that the platform knows to be minors in the United States;
(D) an accounting of total reports received regarding, and the prevalence (which can be based on scientifically valid sampling methods using the content available to the covered platform in the normal course of business) of content related to, the harms described in section 102(a), disaggregated by category of harm and language, including English and the top 5 non-English languages used by users accessing the platform from the United States (as identified under subparagraph (C)(iii)); and
(2) REASONABLY FORESEEABLE RISK OF HARM TO MINORS.—The public reports required of a covered platform under this section shall include—
(A) an assessment of the reasonably foreseeable risk of harms to minors posed by the covered platform, specifically identifying those physical, mental, developmental, or financial harms described in section 102(a);
(B) a description of whether and how the covered platform uses design features that encourage or increase the frequency, time spent, or activity of minors on the covered platform, such as infinite scrolling, auto playing, rewards for time spent on the platform, notifications, and other design features that result in compulsive usage of the covered platform by the minor;
(C) a description of whether, how, and for what purpose the platform collects or processes categories of personal data that may cause reasonably foreseeable risk of harms to minors;
(D) an evaluation of the efficacy of safeguards for minors and parental tools under section 103, and any issues in delivering such safeguards and the associated parental tools;
(3) MITIGATION.—The public reports required of a covered platform under this section shall include, for English and the top 5 non-English languages used by users accessing the platform from the United States (as identified under paragraph (1)(C)(iii))—
(A) a description of the safeguards and parental tools available to minors and parents on the covered platform;
(B) a description of interventions by the covered platform when it had or has reason to believe that harms to minors could occur;
(C) a description of the prevention and mitigation measures intended to be taken in response to the known and emerging risks identified in its assessment of reasonably foreseeable risks of harms to minors, including steps taken to—
(D) a description of internal processes for handling reports and automated detection mechanisms for harms to minors, including the rate, timeliness, and effectiveness of responses under the requirement of section 103(c);
(d) Reasonable inspection.—In conducting an inspection of the reasonably foreseeable risk of harm to minors under this section, an independent, third-party auditor shall—
(2) consult parents and youth experts, including youth and families with relevant past or current experience, public health and mental health nonprofit organizations, health and development organizations, and civil society with respect to the prevention of harms to minors;
(3) conduct research based on experiences of minors that use the covered platform, including reports under section 103(c) and information provided by law enforcement;
(4) take account of research, including research regarding design features, marketing, or product integrity, industry best practices, or outside research;
(e) Cooperation with independent, third-party audit.—To facilitate the report required by subsection (c), a covered platform shall—
(1) provide or otherwise make available to the independent third-party conducting the audit all information and material in its possession, custody, or control that is relevant to the audit;
(f) Privacy safeguards.—
(1) IN GENERAL.—In issuing the public reports required under this section, a covered platform shall take steps to safeguard the privacy of its users, including ensuring that data is presented in a de-identified, aggregated format such that it is not reasonably linkable to any user.
(b) Research on social media harms.—Not later than 12 months after the date of enactment of this Act, the Commission shall seek to enter into a contract with the National Academy, under which the National Academy shall conduct no less than 5 scientific, comprehensive studies and reports on the risk of harms to minors by use of social media and other online platforms, including in English and non-English languages.
(c) Matters to be addressed.—In contracting with the National Academy, the Commission, in consultation with the Secretary, shall seek to commission separate studies and reports, using the Commission's authority under section 6(b) of the Federal Trade Commission Act (15 U.S.C. 46(b)), on the relationship between social media and other online platforms (as defined in this subtitle) and the following matters:
(d) Additional study.—Not earlier than 4 years after the date of enactment of this Act, the Commission shall seek to enter into a contract with the National Academy under which the National Academy shall conduct an additional study and report covering the matters described in subsection (c) for the purposes of providing additional information, considering new research, and other matters.
(e) Content of reports.—The comprehensive studies and reports conducted pursuant to this section shall seek to evaluate impacts and advance understanding, knowledge, and remedies regarding the harms to minors posed by social media and other online platforms, and may include recommendations related to public policy.
(f) Active studies.—If the National Academy is engaged in any active studies on the matters described in subsection (c) at the time that it enters into a contract with the Commission to conduct a study under this section, it may base the study to be conducted under this section on the active study, so long as it otherwise incorporates the requirements of this section.
(g) Collaboration.—In designing and conducting the studies under this section, the Commission, the Secretary, and the National Academy shall consult with the Surgeon General and the Kids Online Safety Council.
(h) Access to Data.—
(1) FACT-FINDING AUTHORITY.—The Commission may issue orders under section 6(b) of the Federal Trade Commission Act (15 U.S.C. 46(b)) to require covered platforms to provide reports, data, or answers in writing as necessary to conduct the studies required under this section.
(2) SCOPE.—In exercising its authority under paragraph (1), the Commission may issue orders to no more than 5 covered platforms per study under this section.
(3) CONFIDENTIAL ACCESS.—Notwithstanding section 6(f) or 21 of the Federal Trade Commission Act (15 U.S.C. 46, 57b–2), the Commission shall enter into agreements with the National Academy to share appropriate information received from a covered platform pursuant to an order under such section 6(b) for a comprehensive study under this section in a confidential and secure manner, and to prohibit the disclosure or sharing of such information by the National Academy. Nothing in this paragraph shall be construed to preclude the disclosure of any such information if authorized or required by any other law.
(a) Market research by covered platforms.—The Federal Trade Commission, in consultation with the Secretary of Commerce, shall issue guidance for covered platforms seeking to conduct market- and product-focused research on minors. Such guidance shall include—
(1) a standard consent form that provides minors and their parents a clear, conspicuous, and easy-to-understand explanation of the scope and purpose of the research to be conducted that is available in English and the top 5 non-English languages used in the United States;
(a) Study.—The Secretary of Commerce, in coordination with the Federal Communications Commission and the Federal Trade Commission, shall conduct a study evaluating the most technologically feasible methods and options for developing systems to verify age at the device or operating system level.
(b) Contents.—Such study shall consider—
(3) the accuracy of such systems and their impact or steps to improve accessibility, including for individuals with disabilities;
(4) how such a system or systems could verify age while mitigating risks to user privacy and data security and safeguarding minors' personal data, emphasizing minimizing the amount of data collected and processed by covered platforms and age verification providers for such a system;
(c) Report.—Not later than 1 year after the date of enactment of this Act, the agencies described in subsection (a) shall submit a report containing the results of the study conducted under such subsection to the Committee on Commerce, Science, and Transportation of the Senate and the Committee on Energy and Commerce of the House of Representatives.
(a) In general.—Not later than 18 months after the date of enactment of this Act, the Federal Trade Commission, in consultation with the Kids Online Safety Council established under section 111, shall issue guidance to—
(1) provide information and examples for covered platforms and auditors regarding the following, with consideration given to differences across English and non-English languages—
(A) identifying design features that encourage or increase the frequency, time spent, or activity of minors on the covered platform;
(C) best practices in providing minors and parents the most protective level of control over privacy and safety;
(2) outline conduct that does not have the purpose or substantial effect of subverting or impairing user autonomy, decision-making, or choice, or of causing, increasing, or encouraging compulsive usage for a minor, such as—
(b) Guidance on knowledge standard.—Not later than 18 months after the date of enactment of this Act, the Federal Trade Commission shall issue guidance to provide information, including best practices and examples, for covered platforms to understand how the Commission would determine whether a covered platform “had knowledge fairly implied on the basis of objective circumstances” for purposes of this subtitle.
(c) Limitation on Federal Trade Commission guidance.—
(1) EFFECT OF GUIDANCE.—No guidance issued by the Federal Trade Commission with respect to this subtitle shall—
(2) USE IN ENFORCEMENT ACTIONS.—In any enforcement action brought pursuant to this subtitle, the Federal Trade Commission or a State attorney general, as applicable—
(B) may not base such enforcement action on, or execute a consent order based on, practices that are alleged to be inconsistent with guidance issued by the Federal Trade Commission with respect to this subtitle, unless the practices are alleged to violate a provision of this subtitle.
For purposes of enforcing this subtitle, State attorneys general shall take into account any guidance issued by the Commission under subsection (b).
(a) Enforcement by Federal Trade Commission.—
(1) UNFAIR AND DECEPTIVE ACTS OR PRACTICES.—A violation of this subtitle shall be treated as a violation of a rule defining an unfair or deceptive act or practice prescribed under section 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)).
(2) POWERS OF THE COMMISSION.—
(A) IN GENERAL.—The Federal Trade Commission (referred to in this section as the “Commission”) shall enforce this subtitle in the same manner, by the same means, and with the same jurisdiction, powers, and duties as though all applicable terms and provisions of the Federal Trade Commission Act (15 U.S.C. 41 et seq.) were incorporated into and made a part of this subtitle.
(B) PRIVILEGES AND IMMUNITIES.—Any person that violates this subtitle shall be subject to the penalties, and entitled to the privileges and immunities, provided in the Federal Trade Commission Act (15 U.S.C. 41 et seq.).
(b) Enforcement by State attorneys general.—
(1) IN GENERAL.—
(A) CIVIL ACTIONS.—In any case in which the attorney general of a State has reason to believe that a covered platform has violated or is violating section 103, 104, or 105, the State, as parens patriae, may bring a civil action on behalf of the residents of the State in a district court of the United States or a State court of appropriate jurisdiction to—
(B) NOTICE.—
(i) IN GENERAL.—Before filing an action under subparagraph (A), the attorney general of the State involved shall provide to the Commission—
(ii) EXEMPTION.—
(2) INTERVENTION.—
(3) CONSTRUCTION.—For purposes of bringing any civil action under paragraph (1), nothing in this subtitle shall be construed to prevent an attorney general of a State from exercising the powers conferred on the attorney general by the laws of that State to—
(4) ACTIONS BY THE COMMISSION.—In any case in which an action is instituted by or on behalf of the Commission for violation of this subtitle, no State may, during the pendency of that action, institute a separate action under paragraph (1) against any defendant named in the complaint in the action instituted by or on behalf of the Commission for that violation.
(a) Establishment.—Not later than 180 days after the date of enactment of this Act, the Secretary of Commerce shall establish and convene the Kids Online Safety Council for the purpose of providing advice on matters related to this subtitle.
(b) Participation.—The Kids Online Safety Council shall include diverse participation from—
(1) academic experts, health professionals, and members of civil society with expertise in mental health, substance use disorders, and the prevention of harms to minors;
(2) representatives in academia and civil society with specific expertise in privacy, free expression, access to information, and civil liberties;
(5) representatives of the National Telecommunications and Information Administration, the National Institute of Standards and Technology, the Federal Trade Commission, the Department of Justice, and the Department of Health and Human Services;
(8) representatives of communities of socially disadvantaged individuals (as defined in section 8 of the Small Business Act (15 U.S.C. 637)).
(c) Activities.—The matters to be addressed by the Kids Online Safety Council shall include—
(2) recommending measures and methods for assessing, preventing, and mitigating harms to minors online;
(d) Non-applicability of FACA.—The Kids Online Safety Council shall not be subject to chapter 10 of title 5, United States Code (commonly referred to as the “Federal Advisory Committee Act”).
Except as otherwise provided in this subtitle, this subtitle shall take effect on the date that is 18 months after the date of enactment of this Act.
(a) Relationship to other laws.—Nothing in this subtitle shall be construed to—
(1) preempt section 444 of the General Education Provisions Act (20 U.S.C. 1232g, commonly known as the “Family Educational Rights and Privacy Act of 1974”) or other Federal or State laws governing student privacy;
(2) preempt the Children's Online Privacy Protection Act of 1998 (15 U.S.C. 6501 et seq.) or any rule or regulation promulgated under such Act;
(3) authorize any action that would conflict with section 18(h) of the Federal Trade Commission Act (15 U.S.C. 57a(h)); or
(4) expand or limit the scope of section 230 of the Communications Act of 1934 (commonly known as “section 230 of the Communications Decency Act of 1996”) (47 U.S.C. 230).
(b) Determination of “fairly implied on the basis of objective circumstances”.—For purposes of enforcing this subtitle, in making a determination as to whether a covered platform has knowledge fairly implied on the basis of objective circumstances that a specific user is a minor, the Federal Trade Commission or a State attorney general shall rely on competent and reliable evidence, taking into account the totality of the circumstances, including whether a reasonable and prudent person under the circumstances would have known that the user is a minor.
(c) Protections for privacy.—Nothing in this subtitle, including a determination described in subsection (b), shall be construed to require—
(d) Compliance.—Nothing in this subtitle shall be construed to restrict a covered platform's ability to—
(1) cooperate with law enforcement agencies regarding activity that the covered platform reasonably and in good faith believes may violate Federal, State, or local laws, rules, or regulations;
(e) Application to video streaming services.—A video streaming service shall be deemed to be in compliance with this subtitle if it predominantly consists of news, sports, entertainment, or other video programming content that is preselected by the provider and not user-generated, and—
(1) any chat, comment, or interactive functionality is provided incidental to, directly related to, or dependent on provision of such content;
(2) if such video streaming service requires account owner registration and is not predominantly news or sports, the service includes the capability—
(B) to limit the automatic playing of on-demand content selected by a personalized recommendation system for an individual that the service knows is a minor;
(C) for a parent to manage a minor’s privacy and account settings, and restrict purchases and financial transactions by a minor, where applicable;
(E) to offer a clear, conspicuous, and easy-to-understand notice of its policies and practices with respect to the capabilities described in this paragraph; and
(F) when providing on-demand content, to employ measures that safeguard against serving advertising for narcotic drugs (as defined in section 102 of the Controlled Substances Act (21 U.S.C. 802)), tobacco products, gambling, or alcohol directly to the account or profile of an individual that the service knows is a minor.
In this subtitle:
(1) ALGORITHMIC RANKING SYSTEM.—The term “algorithmic ranking system” means a computational process, including one derived from algorithmic decision-making, machine learning, statistical analysis, or other data processing or artificial intelligence techniques, used to determine the selection, order, relative prioritization, or relative prominence of content from a set of information that is provided to a user on an online platform, including the ranking of search results, the provision of content recommendations, the display of social media posts, or any other method of automated content selection.
(2) APPROXIMATE GEOLOCATION INFORMATION.—The term “approximate geolocation information” means information that identifies the location of an individual, but with a precision of less than 5 miles.
(4) CONNECTED DEVICE.—The term “connected device” means an electronic device that—
(5) INPUT-TRANSPARENT ALGORITHM.—
(A) IN GENERAL.—The term “input-transparent algorithm” means an algorithmic ranking system that does not use the user-specific data of a user to determine the selection, order, relative prioritization, or relative prominence of information that is furnished to such user on an online platform, unless the user-specific data is expressly provided to the platform by the user for such purpose.
(B) DATA EXPRESSLY PROVIDED TO THE PLATFORM.—For purposes of subparagraph (A), user-specific data that is provided by a user for the express purpose of determining the selection, order, relative prioritization, or relative prominence of information that is furnished to such user on an online platform—
(i) shall include user-supplied search terms, filters, speech patterns (if provided for the purpose of enabling the platform to accept spoken input or selecting the language in which the user interacts with the platform), saved preferences, the resumption of a previous search, and the current precise geolocation information that is supplied by the user;
(iii) shall include data submitted to the platform by the user that expresses the user's desire to receive particular information, such as the social media profiles the user follows, the video channels the user subscribes to, or other content or sources of content on the platform the user has selected;
(6) ONLINE PLATFORM.—The term “online platform” means any public-facing website, online service, online application, or mobile application that predominantly provides a community forum for user-generated content, such as sharing videos, images, games, audio files, or other content, including a social media service, social network, or virtual reality environment.
(7) OPAQUE ALGORITHM.—
(A) IN GENERAL.—The term “opaque algorithm” means an algorithmic ranking system that determines the selection, order, relative prioritization, or relative prominence of information that is furnished to a user on an online platform based, in whole or part, on user-specific data that was not expressly provided by the user to the platform for such purpose.
(a) In general.—Beginning on the date that is 1 year after the date of enactment of this Act, it shall be unlawful for any person to operate an online platform that uses an opaque algorithm unless the person complies with the requirements of subsection (b).
(b) Opaque algorithm requirements.—
(1) IN GENERAL.—The requirements of this subsection with respect to a person that operates an online platform that uses an opaque algorithm are the following:
(A) The person provides users of the platform with the following notices:
(i) Notice that the platform uses an opaque algorithm that uses user-specific data to select the content the user sees. Such notice shall be presented in a clear and conspicuous manner on the platform whenever the user interacts with an opaque algorithm for the first time, and may be a one-time notice that can be dismissed by the user.
(ii) Notice, to be included in the terms and conditions of the online platform, in a clear, accessible, and easily comprehensible manner that is to be updated whenever the online platform makes a material change, of—
(II) how any user-specific data used by the algorithm is collected or inferred about a user of the platform, and the categories of such data;
(2) RULE OF CONSTRUCTION.—Nothing in this subsection shall be construed to require an online platform to disclose any information, including data or algorithms—
(3) PROHIBITION ON DIFFERENTIAL PRICING.—An online platform shall not deny, charge different prices or rates for, or condition the provision of a service or product to a user based on the user’s election to use an input-transparent algorithm in their use of the platform, as provided under paragraph (1)(B).
(c) Enforcement by Federal Trade Commission.—
(1) UNFAIR OR DECEPTIVE ACTS OR PRACTICES.—A violation of this section by an operator of an online platform shall be treated as a violation of a rule defining an unfair or deceptive act or practice prescribed under section 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)).
(2) POWERS OF COMMISSION.—
(A) IN GENERAL.—The Federal Trade Commission shall enforce this section in the same manner, by the same means, and with the same jurisdiction, powers, and duties as though all applicable terms and provisions of the Federal Trade Commission Act (15 U.S.C. 41 et seq.) were incorporated into and made a part of this section.
(B) PRIVILEGES AND IMMUNITIES.—Any person who violates this section shall be subject to the penalties and entitled to the privileges and immunities provided in the Federal Trade Commission Act (15 U.S.C. 41 et seq.).
(d) Rule of construction to preserve personalized blocks.—Nothing in this section shall be construed to limit or prohibit an online platform’s ability to, at the direction of an individual user or group of users, restrict another user from searching for, finding, accessing, or interacting with such user’s or group’s account, content, data, or online community.
The provisions of this title shall preempt any State law, rule, or regulation only to the extent that such State law, rule, or regulation conflicts with a provision of this title. Nothing in this title shall be construed to prohibit a State from enacting a law, rule, or regulation that provides greater protection to minors than the protection provided by the provisions of this title.
(a) Definitions.—Section 1302 of the Children’s Online Privacy Protection Act of 1998 (15 U.S.C. 6501) is amended—
(1) by amending paragraph (2) to read as follows:
“(2) OPERATOR.—The term ‘operator’—
“(A) means any person—
“(i) who, for commercial purposes, in interstate or foreign commerce operates or provides a website on the internet, an online service, an online application, or a mobile application; and
“(ii) who—
“(I) collects or maintains, either directly or through a service provider, personal information from or about the users of that website, service, or application;
“(B) does not include any nonprofit entity that would otherwise be exempt from coverage under section 5 of the Federal Trade Commission Act (15 U.S.C. 45).”;
(2) in paragraph (4)—
(A) by amending subparagraph (A) to read as follows:
“(A) the release of personal information collected from a child or teen by an operator for any purpose, except where the personal information is provided to a person other than an operator who—
(3) by striking paragraph (8) and inserting the following:
“(8) PERSONAL INFORMATION.—
“(A) IN GENERAL.—The term ‘personal information’ means individually identifiable information about an individual collected online, including—
“(vi) any other identifier that the Commission determines permits the physical or online contacting of a specific individual;
“(vii) a persistent identifier that can be used to recognize a specific child or teen over time and across different websites, online services, online applications, or mobile applications, including but not limited to a customer number held in a cookie, an Internet Protocol (IP) address, a processor or device serial number, or unique device identifier, but excluding an identifier that is used by an operator solely for providing support for the internal operations of the website, online service, online application, or mobile application;
“(viii) a photograph, video, or audio file where such file contains a specific child's or teen's image or voice;
“(B) EXCLUSION.—The term ‘personal information’ shall not include an audio file that contains a child's or teen’s voice so long as the operator—
“(i) does not request information via voice that would otherwise be considered personal information under this paragraph;
“(ii) provides clear notice of its collection and use of the audio file and its deletion policy in its privacy policy;
“(C) SUPPORT FOR THE INTERNAL OPERATIONS OF A WEBSITE, ONLINE SERVICE, ONLINE APPLICATION, OR MOBILE APPLICATION.—
“(i) IN GENERAL.—For purposes of subparagraph (A)(vii), the term ‘support for the internal operations of a website, online service, online application, or mobile application’ means those activities necessary to—
“(I) maintain or analyze the functioning of the website, online service, online application, or mobile application;
“(III) authenticate users of, or personalize the content on, the website, online service, online application, or mobile application;
“(IV) serve contextual advertising, provided that any persistent identifier is only used as necessary for technical purposes to serve the contextual advertisement, or cap the frequency of advertising;
“(ii) CONDITION.—Except as specifically permitted under clause (i), information collected for the activities listed in clause (i) cannot be used or disclosed to contact a specific individual, including through individual-specific advertising to children or teens, to amass a profile on a specific individual, in connection with processes that encourage or prompt use of a website or online service, or for any other purpose.”;
(4) by amending paragraph (9) to read as follows:
“(9) VERIFIABLE CONSENT.—The term ‘verifiable consent’ means any reasonable effort (taking into consideration available technology), including a request for authorization for future collection, use, and disclosure described in the notice, to ensure that, in the case of a child, a parent of the child, or, in the case of a teen, the teen—
(5) in paragraph (10)—
(A) in the paragraph header, by striking “Website or online service directed to children” and inserting “Website, online service, online application, or mobile application directed to children”;
(B) by striking “website or online service” each place it appears and inserting “website, online service, online application, or mobile application”; and
(C) by adding at the end the following new subparagraph:
“(C) RULE OF CONSTRUCTION.—In considering whether a website, online service, online application, or mobile application, or portion thereof, is directed to children, the Commission shall apply a totality of the circumstances test and shall also consider competent and reliable empirical evidence regarding audience composition and evidence regarding the intended audience of the website, online service, online application, or mobile application.”; and
(6) by adding at the end the following:
“(13) CONNECTED DEVICE.—The term ‘connected device’ means a device that is capable of connecting to the internet, directly or indirectly, or to another connected device.
“(15) MOBILE APPLICATION.—The term ‘mobile application’—
“(16) GEOLOCATION INFORMATION.—The term ‘geolocation information’ means information sufficient to identify a street name and name of a city or town.
“(18) INDIVIDUAL-SPECIFIC ADVERTISING TO CHILDREN OR TEENS.—
“(A) IN GENERAL.—The term ‘individual-specific advertising to children or teens’ means advertising or any other effort to market a product or service that is directed to a specific child or teen or a connected device that is linked or reasonably linkable to a child or teen based on—
“(B) EXCLUSIONS.—The term ‘individual-specific advertising to children or teens’ shall not include—
“(i) advertising or marketing to an individual or the device of an individual in response to the individual’s specific request for information or feedback, such as a child's or teen's current search query;
“(C) RULE OF CONSTRUCTION.—Nothing in subparagraph (A) shall be construed to prohibit an operator with actual knowledge or knowledge fairly implied on the basis of objective circumstances that a user is under the age of 17 from delivering advertising or marketing that is age-appropriate and intended for a child or teen audience, so long as the operator does not use any personal information other than whether the user is under the age of 17.”.
(b) Online collection, use, disclosure, and deletion of personal information of children and teens.—Section 1303 of the Children’s Online Privacy Protection Act of 1998 (15 U.S.C. 6502) is amended—
(1) by striking the heading and inserting the following: “Online collection, use, disclosure, and deletion of personal information of children and teens.”;
(2) in subsection (a)—
(A) by amending paragraph (1) to read as follows:
“(1) IN GENERAL.—It is unlawful for an operator of a website, online service, online application, or mobile application directed to children or for any operator of a website, online service, online application, or mobile application with actual knowledge or knowledge fairly implied on the basis of objective circumstances that a user is a child or teen—
“(A) to collect personal information from a child or teen in a manner that violates the regulations prescribed under subsection (b);
“(B) except as provided in subparagraphs (B) and (C) of section 1302(18), to collect, use, disclose to third parties, or maintain personal information of a child or teen for purposes of individual-specific advertising to children or teens (or to allow another person to collect, use, disclose, or maintain such information for such purpose);
“(C) to collect the personal information of a child or teen except when the collection of the personal information is—
“(D) to store or transfer the personal information of a child or teen outside of the United States unless the operator provides direct notice to the parent of the child, in the case of a child, or to the teen, in the case of a teen, that the child's or teen's personal information is being stored or transferred outside of the United States; or
(3) in subsection (b)—
(A) in paragraph (1)—
(i) in subparagraph (A)—
(I) by striking “operator of any website” and all that follows through “from a child” and inserting “operator of a website, online service, online application, or mobile application directed to children or that has actual knowledge or knowledge fairly implied on the basis of objective circumstances that a user is a child or teen”;
(II) in clause (i)—
(aa) by striking “notice on the website” and inserting “clear and conspicuous notice on the website”;
(dd) by striking “; and” and inserting “, the rights and opportunities available to the parent of the child or teen under subparagraphs (B) and (C), and the procedures or mechanisms the operator uses to ensure that personal information is not collected from children or teens except in accordance with the regulations promulgated under this paragraph;”;
(IV) by inserting after clause (ii) the following new clause:
“(iii) to obtain verifiable consent from a parent of a child or from a teen before using or disclosing personal information of the child or teen for any purpose that is a material change from the original purposes and disclosure practices specified to the parent of the child or the teen under clause (i);”;
(ii) in subparagraph (B)—
(I) in the matter preceding clause (i), by striking “website or online service” and inserting “operator”;
(II) in clause (i), by inserting “and the method by which the operator obtained the personal information, and the purposes for which the operator collects, uses, discloses, and retains the personal information” before the semicolon;
(III) in clause (ii)—
(iv) by inserting after subparagraph (B) the following new subparagraph:
“(C) require the operator to provide, upon the request of a teen under this subparagraph who has provided personal information to the operator, upon proper identification of that teen—
“(i) a description of the specific types of personal information collected from the teen by the operator, the method by which the operator obtained the personal information, and the purposes for which the operator collects, uses, discloses, and retains the personal information;
“(ii) the opportunity at any time to delete personal information collected from the teen or content or information submitted by the teen to a website, online service, online application, or mobile application and to refuse to permit the operator's further use or maintenance in retrievable form, or online collection, of personal information from the teen;
(B) in paragraph (2)—
(i) in the matter preceding subparagraph (A), by striking “verifiable parental consent” and inserting “verifiable consent”;
(iv) in subparagraph (C)—
(I) in the matter preceding clause (i), by inserting “or teen” after “child” each place the term appears;
(C) by redesignating paragraph (3) as paragraph (4) and inserting after paragraph (2) the following new paragraph:
“(3) APPLICATION TO OPERATORS ACTING UNDER AGREEMENTS WITH EDUCATIONAL AGENCIES OR INSTITUTIONS.—The regulations may provide that verifiable consent under paragraph (1)(A)(ii) is not required for an operator that is acting under a written agreement with an educational agency or institution (as defined in section 444 of the General Education Provisions Act (commonly known as the ‘Family Educational Rights and Privacy Act of 1974’) (20 U.S.C. 1232g(a)(3))) that, at a minimum, requires the—
“(A) operator to—
“(i) limit its collection, use, and disclosure of the personal information from a child or teen to solely educational purposes and for no other commercial purposes;
“(ii) provide the educational agency or institution with a notice of the specific types of personal information the operator will collect from the child or teen, the method by which the operator will obtain the personal information, and the purposes for which the operator will collect, use, disclose, and retain the personal information;
“(iii) provide the educational agency or institution with a link to the operator’s online notice of information practices as required under subsection (b)(1)(A)(i); and
“(iv) provide the educational agency or institution, upon request, with a means to review the personal information collected from a child or teen, to prevent further use or maintenance or future collection of personal information from a child or teen, and to delete personal information collected from a child or teen or content or information submitted by a child or teen to the operator’s website, online service, online application, or mobile application;
“(B) representative of the educational agency or institution to acknowledge and agree that they have authority to authorize the collection, use, and disclosure of personal information from children or teens on behalf of the educational agency or institution, and to provide, along with such authorization, their name and title at the educational agency or institution; and
“(C) educational agency or institution to—
“(i) provide on its website a notice that identifies the operator with which it has entered into a written agreement under this subsection and provides a link to the operator’s online notice of information practices as required under paragraph (1)(A)(i);
“(ii) provide the operator’s notice regarding its information practices, as required under subparagraph (A)(ii), upon request, to a parent, in the case of a child, or a parent or teen, in the case of a teen; and
“(iii) upon the request of a parent, in the case of a child, or a parent or teen, in the case of a teen, request the operator provide a means to review the personal information from the child or teen and provide the parent, in the case of a child, or parent or teen, in the case of the teen, a means to review the personal information.”;
(D) by amending paragraph (4), as so redesignated, to read as follows:
“(4) TERMINATION OF SERVICE.—The regulations shall permit the operator of a website, online service, online application, or mobile application to terminate service provided to a child whose parent has refused, or a teen who has refused, under the regulations prescribed under paragraphs (1)(B)(ii) and (1)(C)(ii), to permit the operator’s further use or maintenance in retrievable form, or future online collection of, personal information from that child or teen.”; and
(E) by adding at the end the following new paragraphs:
“(5) CONTINUATION OF SERVICE.—The regulations shall prohibit an operator from discontinuing service provided to a child or teen on the basis of a request by the parent of the child or by the teen, under the regulations prescribed under subparagraph (B) or (C) of paragraph (1), respectively, to delete personal information collected from the child or teen, to the extent that the operator is capable of providing such service without such information.
“(6) RULE OF CONSTRUCTION.—A request made pursuant to subparagraph (B) or (C) of paragraph (1) to delete or correct personal information of a child or teen shall not be construed—
“(A) to limit the authority of a law enforcement agency to obtain any content or information from an operator pursuant to a lawfully executed warrant or an order of a court of competent jurisdiction;
“(B) to require an operator or third party to delete or correct information that—
“(i) any other provision of Federal or State law requires the operator or third party to maintain; or
“(ii) was submitted to the website, online service, online application, or mobile application of the operator by any person other than the user who is attempting to erase or otherwise eliminate the content or information, including content or information submitted by the user that was republished or resubmitted by another person; or
“(C) to prohibit an operator from—
“(i) retaining a record of the deletion request and the minimum information necessary for the purposes of ensuring compliance with a request made pursuant to subparagraph (B) or (C);
“(ii) preventing, detecting, protecting against, or responding to security incidents, identity theft, or fraud, or reporting those responsible for such actions;
“(7) COMMON VERIFIABLE CONSENT MECHANISM.—
“(A) IN GENERAL.—
“(i) FEASIBILITY OF MECHANISM.—The Commission shall assess the feasibility, with notice and public comment, of allowing operators the option to use a common verifiable consent mechanism that fully meets the requirements of this title.
“(ii) REQUIREMENTS.—The feasibility assessment described in clause (i) shall consider whether a single operator could use a common verifiable consent mechanism to obtain verifiable consent, as required under this title, from a parent of a child or from a teen on behalf of multiple, listed operators that provide a joint or related service.
“(B) REPORT.—Not later than 1 year after the date of enactment of this paragraph, the Commission shall submit a report to the Committee on Commerce, Science, and Transportation of the Senate and the Committee on Energy and Commerce of the House of Representatives with the findings of the assessment required by subparagraph (A).
“(C) REGULATIONS.—If the Commission finds that the use of a common verifiable consent mechanism is feasible and would meet the requirements of this title, the Commission shall issue regulations to permit the use of a common verifiable consent mechanism in accordance with the findings outlined in such report.”;
(4) in subsection (c), by striking “a regulation prescribed under subsection (a)” and inserting “subparagraph (B), (C), (D), or (E) of subsection (a)(1), or of a regulation prescribed under subsection (b),”; and
(5) by striking subsection (d) and inserting the following:
“(d) Relationship to State law.—The provisions of this title shall preempt any State law, rule, or regulation only to the extent that such State law, rule, or regulation conflicts with a provision of this title. Nothing in this title shall be construed to prohibit any State from enacting a law, rule, or regulation that provides greater protection to children or teens than the provisions of this title.”.
(c) Safe harbors.—Section 1304 of the Children’s Online Privacy Protection Act of 1998 (15 U.S.C. 6503) is amended—
(2) by adding at the end the following:
“(d) Publication.—
“(1) IN GENERAL.—Subject to the restrictions described in paragraph (2), the Commission shall publish on the internet website of the Commission any report or documentation required by regulation to be submitted to the Commission to carry out this section.
“(2) RESTRICTIONS ON PUBLICATION.—The restrictions described in section 6(f) and section 21 of the Federal Trade Commission Act (15 U.S.C. 46(f), 57b–2) applicable to the disclosure of information obtained by the Commission shall apply in the same manner to the disclosure under this subsection of information obtained by the Commission from a report or documentation described in paragraph (1).”.
(d) Actions by States.—Section 1305 of the Children’s Online Privacy Protection Act of 1998 (15 U.S.C. 6504) is amended—
(e) Administration and applicability of Act.—Section 1306 of the Children’s Online Privacy Protection Act of 1998 (15 U.S.C. 6505) is amended—
(1) in subsection (b)—
(A) in paragraph (1), by striking “, in the case of” and all that follows through “the Board of Directors of the Federal Deposit Insurance Corporation;” and inserting the following: “by the appropriate Federal banking agency, with respect to any insured depository institution (as those terms are defined in section 3 of that Act (12 U.S.C. 1813));”; and
(3) by adding at the end the following new subsections:
“(f) Determination of whether an operator has knowledge fairly implied on the basis of objective circumstances.—
“(1) RULE OF CONSTRUCTION.—For purposes of enforcing this title or a regulation promulgated under this title, in making a determination as to whether an operator has knowledge fairly implied on the basis of objective circumstances that a specific user is a child or teen, the Commission or State attorneys general shall rely on competent and reliable evidence, taking into account the totality of the circumstances, including whether a reasonable and prudent person under the circumstances would have known that the user is a child or teen. Nothing in this title, including a determination described in the preceding sentence, shall be construed to require an operator to—
“(2) COMMISSION GUIDANCE.—
“(A) IN GENERAL.—Not later than 180 days after the date of enactment of this subsection, the Commission shall issue guidance to provide information, including best practices and examples, for operators to understand the Commission’s determination of whether an operator has knowledge fairly implied on the basis of objective circumstances that a user is a child or teen.
“(B) LIMITATION.—No guidance issued by the Commission with respect to this title shall confer any rights on any person, State, or locality, nor shall such guidance operate to bind the Commission or any person to the approach recommended in such guidance. In any enforcement action brought pursuant to this title, the Commission or State attorney general, as applicable, shall allege a specific violation of a provision of this title. The Commission or State attorney general, as applicable, may not base an enforcement action on, or execute a consent order based on, practices that are alleged to be inconsistent with any such guidance, unless the practices allegedly violate this title. For purposes of enforcing this title or a regulation promulgated under this title, State attorneys general shall take into account any guidance issued by the Commission under subparagraph (A).
“(g) Additional requirement.—Any regulations issued under this title shall include a description and analysis of the impact of proposed and final rules on small entities in accordance with the Regulatory Flexibility Act of 1980 (5 U.S.C. 601 et seq.).”.
(a) Oversight report.—Not later than 3 years after the date of enactment of this Act, the Federal Trade Commission shall submit to the Committee on Commerce, Science, and Transportation of the Senate and the Committee on Energy and Commerce of the House of Representatives a report on the processes of platforms that offer mobile and online applications for ensuring that, of those applications that are websites, online services, online applications, or mobile applications directed to children, the applications operate in accordance with—
(2) rules promulgated by the Commission under section 18 of the Federal Trade Commission Act (15 U.S.C. 57a) relating to unfair or deceptive acts or practices in marketing.
(b) Enforcement report.—Not later than 1 year after the date of enactment of this Act, and each year thereafter, the Federal Trade Commission shall submit to the Committee on Commerce, Science, and Transportation of the Senate and the Committee on Energy and Commerce of the House of Representatives a report that addresses, at a minimum—
(1) the number of actions brought by the Commission during the reporting year to enforce the Children’s Online Privacy Protection Act of 1998 (15 U.S.C. 6501 et seq.) (referred to in this subsection as the “Act”) and the outcome of each such action;
(2) the total number of investigations or inquiries into potential violations of the Act during the reporting year;
(3) the total number of open investigations or inquiries into potential violations of the Act as of the time the report is submitted;
(a) Study.—The Comptroller General of the United States (in this section referred to as the “Comptroller General”) shall conduct a study on the privacy of teens who use financial technology products. Such study shall—
(b) Report.—Not later than 1 year after the date of enactment of this section, the Comptroller General shall submit to Congress a report containing the results of the study conducted under subsection (a), together with recommendations for such legislation and administrative action as the Comptroller General determines appropriate.
(a) In general.—Section 1125 of title 31, United States Code, is amended—
(2) by striking subsections (a) and (b) and inserting the following:
“(a) Definitions.—In this section:
“(1) BUDGET JUSTIFICATION MATERIALS.—The term ‘budget justification materials’ has the meaning given the term in section 3(b)(2) of the Federal Funding Accountability and Transparency Act of 2006 (31 U.S.C. 6101 note; Public Law 109–282).
“(2) PLAN OR REPORT.—The term ‘plan or report’ means any plan or report submitted to Congress, any committee of Congress, or subcommittee thereof, by not less than 1 agency—
“(3) RECURRING PLAN OR REPORT.—The term ‘recurring plan or report’ means a plan or report submitted on a recurring basis.
“(b) Agency identification of unnecessary reports.—
“(1) IN GENERAL.—The head of each agency shall include in the budget justification materials of the agency the following:
“(A) Subject to paragraphs (2) and (3), the following:
“(ii) An identification of whether the recurring plan or report listed in clause (i) was included in the most recent report issued by the Clerk of the House of Representatives concerning the reports that any agency is required by law or directed or requested by a committee report to make to Congress, any committee of Congress, or subcommittee thereof.
“(iii) If applicable, the unique alphanumeric identifier for the recurring plan or report as required by section 7243(b)(1)(C)(vii) of the James M. Inhofe National Defense Authorization Act for Fiscal Year 2023 (Public Law 117–263).
“(B) With respect to each recurring plan or report identified in subparagraph (A)(iv), the following:
“(i) A recommendation on whether to sunset, modify, consolidate, or reduce the frequency of the submission of the recurring plan or report.
“(C) A justification explaining, with respect to each recommendation described in subparagraph (B)(i) relating to a recurring plan or report—
“(2) AGENCY CONSULTATION.—
“(A) IN GENERAL.—In preparing the list required under paragraph (1)(A), if, in submitting a recurring plan or report, an agency is required to coordinate or consult with another agency or entity, the head of the agency submitting the recurring plan or report shall consult with the head of each agency or entity with whom consultation or coordination is required.
“(B) INCLUSION IN LIST.—If, after a consultation under subparagraph (A), the head of each agency or entity consulted under that subparagraph agrees that a recurring plan or report is outdated or duplicative, the head of the agency required to submit the recurring plan or report shall—
“(C) DISAGREEMENT.—If the head of any agency or entity consulted under subparagraph (A) does not agree that a recurring plan or report is outdated or duplicative, the head of the agency required to submit the recurring plan or report shall not include the recurring plan or report in the list described in paragraph (1)(A).
“(3) GOVERNMENT-WIDE OR MULTI-AGENCY PLAN AND REPORT SUBMISSIONS.—With respect to a recurring plan or report required to be submitted by not less than 2 agencies, the Director of the Office of Management and Budget shall—
“(4) PLAN AND REPORT SUBMISSIONS CONFORMITY TO THE ACCESS TO CONGRESSIONALLY MANDATED REPORTS ACT.—With respect to an agency recommendation, citation, or justification made under subparagraph (B) or (C) of paragraph (1) or a recommendation by the Director of the Office of Management and Budget under paragraph (3), the agency or Director, as applicable, shall also provide this information to the Director of the Government Publishing Office in conformity with the agency submission requirements under section 7244(a) of the James M. Inhofe National Defense Authorization Act for Fiscal Year 2023 (Public Law 117–263; chapter 41 of title 44, United States Code, note) and with guidance issued by the Director of the Office of Management and Budget under section 7244(b) of such Act.”.
(b) Budget contents.—Section 1105(a) of title 31, United States Code, is amended by striking paragraph (39).
(c) Conformity to the Access to Congressionally Mandated Reports Act.—
(1) AMENDMENT.—Subsections (a) and (b) of section 7244 of the James M. Inhofe National Defense Authorization Act for Fiscal Year 2023 (Public Law 117–263; chapter 41 of title 44, United States Code, note) are amended to read as follows:
“(a) Submission of electronic copies of reports.—Not earlier than 30 days or later than 60 days after the date on which a congressionally mandated report is submitted to either House of Congress or to any committee of Congress or subcommittee thereof, the head of the Federal agency submitting the congressionally mandated report shall submit to the Director the information required under subparagraphs (A) through (D) of section 7243(b)(1) with respect to the congressionally mandated report. Notwithstanding section 7246, nothing in this subtitle shall relieve a Federal agency of any other requirement to publish the congressionally mandated report on the online portal of the Federal agency or otherwise submit the congressionally mandated report to Congress or specific committees of Congress, or subcommittees thereof.
“(b) Guidance.—Not later than 180 days after the date of the enactment of this subsection and periodically thereafter as appropriate, the Director of the Office of Management and Budget, in consultation with the Director, shall issue guidance to agencies on the implementation of this subtitle as well as the requirements of section 1125(b) of title 31, United States Code.”.
(2) UPDATED OMB GUIDANCE.—Not later than 180 days after the date of the enactment of this Act, the Director of the Office of Management and Budget shall issue updated guidance to agencies to ensure that the recommendations and justifications required under subsections (a) and (b) of section 1125 of title 31, United States Code, as amended by this Act, for plans and reports to be sunset, modified, consolidated, or reduced in frequency of submission are also submitted as a separate attachment in conformity with the agency submission requirements for electronic copies of reports submitted by agencies under section 7244(a) of the James M. Inhofe National Defense Authorization Act for Fiscal Year 2023 (Public Law 117–263; chapter 41 of title 44, United States Code, note) for publication on the online portal established under section 7243 of such Act.
Amend the title so as to read: “An Act to protect the safety and privacy of children on the internet.”.
Attest:
Secretary