Privacy Laws

Several laws that affect consumer privacy have been passed or are in various stages of development in the U.S. Two of the earliest privacy laws are the Freedom of Information Act (FOIA) and the Privacy Act of 1974. These acts allow consumers to request information from government agencies, such as the FBI and the Department of Justice. The Privacy Act allows you to obtain your own records and amend or delete information about you that is inaccurate, irrelevant, outdated, or incomplete. You have the right to sue the agency if it refuses to correct or amend your record, or if it refuses to give you access to it. The FOIA applies only to federal agencies and does not create a right of access to records held by Congress, the courts, or state or local government agencies. Each state has its own public access laws. The FOIA requires an agency to decide within ten working days whether to comply with a request and to inform the person making the request of its decision.

Several differences exist between the two acts:

  • Under the Privacy Act, only U.S. citizens or permanent resident aliens can obtain access to records, and only to records that can be retrieved from a system of records by your name, a number, a symbol, or some other identifying particular assigned to you. The FOIA, as stated previously, allows any person to obtain access to agency records.

  • The Privacy Act carries broader exemptions than the FOIA does. Some law enforcement agencies can refuse your FOIA requests.

  • The Privacy Act permits an agency to charge requesters for copying, but not for search costs.

  • The statute of limitations for filing a lawsuit is only two years under the Privacy Act, but is six years under the FOIA.

Privacy advocates have made wide use of these acts in the past several years to promote individual privacy and learn how the government tracks consumer information.

One law that was passed but later found unconstitutional in the U.S. is the Communications Decency Act (CDA), which was enacted as part of the Telecommunications Act of 1996. The CDA prohibited Internet users from using the Internet to communicate material that, under contemporary community standards, would be deemed patently offensive to minors under the age of 18. The Supreme Court struck down the act on First Amendment grounds. The CDA provided two affirmative defenses to prosecution: (1) the use of a credit card or other age verification system, and (2) any good faith effort to restrict access by minors. Key terms were not defined in the act and were deemed unconstitutionally vague. The court found that the act was “wholly unprecedented” in that, for example, it was “not limited to commercial speech or commercial entities…[but rather] its open-ended prohibitions embrace all nonprofit entities and individuals posting indecent messages or displaying them on their own computers.” The actions required of Internet users and companies would have been cost-prohibitive, and the technology needed to enforce the restrictions the act envisioned had not yet been developed.

In response to the Supreme Court's decision on the CDA, the Child Online Protection Act (COPA) was enacted into law on October 21, 1998. This act subjected commercial Web publishers to the restriction of ensuring that minors could not access harmful material on the Web. The act uses defined terms and is much narrower in scope than the CDA. However, the Third Circuit Court of Appeals found COPA to be unconstitutional. The Department of Justice has filed a petition for certiorari asking the U.S. Supreme Court to reverse the decision of the Third Circuit Court of Appeals. The appeal “presents a conflict between one of society's most cherished rights—freedom of expression—and one of the government's most profound obligations—the protection of minors.” COPA is supposed to protect minors by using “contemporary community standards” to keep them from viewing material knowingly posted on the World Wide Web for commercial purposes. The court found that because the Web is accessible by all Internet users worldwide, and because current technology does not permit a Web publisher to restrict access to its site based on a visitor's geographic location, it is not possible for a Web publisher to conform to the act for every community from which the Internet can be accessed. The court also found that the act imposes an impermissible burden on constitutionally protected First Amendment speech; the technology needed to comply with the law simply does not yet exist.

A similar law tries to protect the public use of the Internet in schools and libraries from questionable material. The Children's Internet Protection Act (CHIPA), which requires schools and libraries to block access to certain material for both adults and minors, is being contested as unconstitutional. Beyond the constitutional question is the question of how the technology is used. Blocking technology is nowhere near perfect, and users will be blocked from resources to which they should have legitimate access under the law. The EFF, the Online Policy Group (http://www.onlinepolicy.org), and the American Civil Liberties Union (http://www.aclu.org) have been leading the protests against the law.

A new attack on your privacy that has the force of law behind it is the Financial Services Modernization Act (also known as the Gramm-Leach-Bliley Act, or GLB). This act allows banks, insurance companies, and brokerage firms to operate as one. Companies can now merge customer data from several sources and sell it to third parties. Think about all the personal information collected through your bank, brokerage house, and insurance company. All your financial information, the companies you invest in, and your medical history can be aggregated into one large profile about you. Even before this act was passed in 1999, there were few restrictions on a financial institution's ability to share or sell your personal information.

GLB requires financial institutions to provide consumers with three key notices:

  • Privacy policy— The institution must disclose what types of information it collects about you and how it uses that information.

  • Right to opt-out— Institutions must explain how you can prevent the sale of your customer data to third parties.

  • Safeguards— Institutions are required to develop policies to prevent fraudulent access to confidential financial information.

Each financial institution you do business with will send you its own privacy notice. Notices should arrive by July 1, 2001. Consumers are entitled to a “reasonable” time to respond before personal data can be disclosed. To opt out of the dissemination of your information, you must return the notice so that it reaches the company within 30 days of the date it was sent to you, and you must follow precise instructions on how to inform the company that you don't want it to sell your information. You might receive more than 10 notices that you must read through and respond to in order to keep your data secure. The law and regulations require only that you be notified of the categories of information the financial institution collects and the categories of information that might be sold or shared with a third party; the details are not required by law to be sent to you. GLB and the federal regulations only keep financial institutions from disclosing your account number or access code to a nonaffiliated third party.

Unless you opt out, sensitive information such as details about your health and treatments can be disclosed to a nonaffiliated third party or even sold to outside marketing companies. The status of these medical privacy rules is ever changing. The Bush administration is waiting on a study to determine what action to take. Individual states have passed their own laws, such as the one passed by California that makes it a crime for an insurance company to sell information to a financial institution for the purpose of granting credit (AB 2797 in the 2000 legislative session, California Civil Code 56.26). But that law does not cover information that flows from a financial institution to an insurance company. Information can still be aggregated from nonaffiliated third parties, consumer reporting agencies, or public records, none of which is covered by the law.

Under GLB, a company can share your personal information with its affiliates, but you can opt out under the Fair Credit Reporting Act (FCRA). This law gives you the right to prevent a company from sharing information about your creditworthiness and information from your applications with an affiliate. Under federal rules, a credit reporting agency (CRA), such as Equifax, Experian, or Trans Union, can't sell so-called “credit header” information (your name, address, phone number, age, and Social Security number) to third parties unless your bank has given you the right to opt out. But if you do not know you have the right to opt out of such activities, institutions will have free rein to do as they please with your information. GLB does not contain what is called a private right of action, so you have no recourse to take an institution to court under federal law; however, state laws might differ. Trying to opt out of dissemination under GLB is almost a lost cause; the act contains a number of loopholes that allow your information to be traded like baseball cards. This does not paint a rosy picture for the future of our privacy.

One privacy standard in the fledgling stages of acceptance is Safe Harbor, an international privacy agreement that took effect in 2000. It draws the line between acceptable privacy practices in Europe and the United States and is the result of an agreement between the U.S. Department of Commerce and the European Commission. The European Union is more concerned with privacy than the U.S. is and has passed more stringent laws. Safe Harbor governs the transatlantic flow of data. The agreement sets up a framework for certifying companies that collect data under privacy protection standards and is visible in the form of a new privacy-seal program, which TRUSTe launched to give a seal of approval to companies that comply with the EU Safe Harbor certification. Safe Harbor is a compromise over problems for Web companies that began with the European Commission's Directive on Data Privacy. The directive went into effect in October 1998 and blocks the transfer of personal data to non-European Union nations that do not meet European standards for privacy protection.

U.S. companies that voluntarily enter Safe Harbor will be deemed to have “adequate” privacy protection, and data transfers to those firms can continue. The problem, as with all other self-regulation in the U.S., is that no law exists to force companies to comply with Safe Harbor. Joining does benefit companies from a business perspective, though, which should persuade many of them to participate. After a company gets Safe Harbor status, monitoring the company is left to private-sector groups, such as the Better Business Bureau or the American Arbitration Association.

The Council of the European Union (the 15 EU governments) will be supporting a request from EU law enforcement agencies that full access to all telecommunications data be written into all future community legislation. Existing European Union laws will be reexamined with the goal of analyzing data retention (the archiving of all telecommunications data for at least seven years). EU member states have been amending national laws on the interception and storage of data to combat computer crime, which has the effect of requiring personal data to be stored and opened up to law enforcement agencies. Future laws, including the proposals currently being discussed on the protection of privacy and on computer-aided crime, are expected to mandate the retention of data, which will greatly affect the privacy of personal information. These strict EU controls have the same effect as some proposed and already passed U.S. laws in that they leave personal information open to the government's perusal.

In Australia, the Privacy Amendment (Private Sector) Act 2000 establishes a national scheme for the handling of personal information by private-sector organizations. This act is similar to the directives passed by the EU; it was developed to give Australian businesses a framework for operating in the global information economy that is compatible with the European Union directive on privacy. The legislation establishes the National Privacy Principles (NPP), a minimum set of privacy standards to regulate the collection, use, disclosure, and transfer of personal information. Organizations are required to keep personal information accurate and up-to-date and to keep the data secure. Companies must also be open about how they manage personal information, provide access and correction rights to individuals, and allow people to deal with them anonymously. As more countries define privacy laws, we will see either more security and control of our data or loss of privacy to government and corporations, depending on which segment does the better lobbying—privacy advocate groups or corporations.
