Generative AI and Data Privacy: A Primer


Since the public release of OpenAI’s ChatGPT, Google’s Bard, and other similar systems, some Members of Congress have expressed interest in the risks associated with “generative artificial intelligence (AI).” Although exact definitions vary, generative AI is a type of AI that can generate new content—such as text, images, and videos—by learning patterns from pre-existing data. It is a broad term that may encompass various technologies and techniques from AI and machine learning (ML).

Generative AI models have received significant attention and scrutiny due to their potential harms, such as risks involving privacy, misinformation, copyright, and non-consensual sexual imagery. This report focuses on privacy issues and relevant policy considerations for Congress. Some policymakers and stakeholders have raised privacy concerns about how individual data may be used to develop and deploy generative models. These concerns are not new or unique to generative AI, but the scale, scope, and capacity of such technologies may present new privacy challenges for Congress.

Generative AI at a Glance

Major Developers and Selected Products:

  • OpenAI (with partnerships and funding from Microsoft)—“ChatGPT” chatbot, “DALL-E” image
    generator
  • Google—“Bard” chatbot
  • Meta—“LLaMA” research tool, “Make-A-Video” video generator
  • Anthropic (founded by former employees of OpenAI)—“Claude” chatbot
  • Stability AI—“Stable Diffusion” image generator
  • Hugging Face—“BLOOM” language model
  • NVIDIA—“NeMo” chatbot, “Picasso” visual content generator

Types of Applications:

  • Chatbots—systems that simulate human conversation, often in question-and-answer format
  • Image generators—systems that generate images based on an input or “prompt”
  • Video generators—systems that generate videos based on an input or “prompt,” sometimes called deepfakes
  • Voice clones—systems that generate speech and voice sounds, sometimes called audio deepfakes
