Conclusion: Democratizing Access to AI
Artificial intelligence has the potential to improve ordinary people’s lives in countless ways. Democratizing access to AI will make it possible for this transformative technology to benefit everyone.
The authors of this book believe that businesses and research facilities working in the field of AI have a big role to play in making AI more accessible—by sharing the outcomes of their research and development with a broader audience, much as OpenAI has done with GPT-3 in the form of its publicly available API. Making such a powerful tool available at a marginal cost to users in important fields can have a long-lasting positive impact on the world.
To conclude the book, this short chapter will look at how no-code and low-code programming leverage GPT-3 to move from ideas to working products. It’s a great example of how GPT-3 and large language models are changing jobs, economies, and futures. Then we’ll finish up with some takeaways for you to consider as you begin your GPT-3 journey.
No Code? No Problem!
At its simplest, no-code is a way of programming computers—creating websites, mobile apps, programs, or scripts—using a simple interface instead of writing in a programming language. The no-code movement, often hailed as the “future of coding,” rests on the fundamental belief that technology should enable and facilitate creation, not act as a barrier to entry for those who want to develop software[29]. The no-code movement’s goal is to make it possible for anyone to create programs and apps that work, without programming skills or specialized equipment. This mission seems to go hand in hand with the evolution of model-as-a-service and the overall trend toward democratizing AI.
As of early 2022, the industry standard for no-code platform tools is Bubble, a pioneering visual-programming language and app-development program that enables users to create full-fledged web applications without writing a single line of code. The ripples from its impact have put a whole new industry in motion. In the words of founder Josh Haas, Bubble is “a platform where users can just describe in simple language what they want and how they want it and can automate the development without any code.” Haas was inspired, he explains in an interview, by noticing a “huge mismatch between the number of people who want to create with technology, build websites, build web applications, and the resources available in the form of engineering talent.”
Currently, building, developing, and maintaining an enterprise-level web application (such as Twitter, Facebook, or Airbnb, to name a few of the largest) requires talent with extensive technical expertise. Would-be independent developers must learn to code from scratch before actually building anything, which takes time and effort. “It's such a time-consuming process for most people that it poses a huge barrier to entry,” Haas says.
This means that entrepreneurs who don’t have a development, software engineering, or coding background, but who have a great application idea and want to build a company around it, must depend on those who have that expertise—and persuade them to work on the idea. Haas notes that, as you might expect, “it is very hard to convince someone to work just for equity on an unproven idea, even if it's a good idea.”
In-house talent is crucial, Haas argues: while it’s possible to work with independent contractors, this requires a lot of back and forth and often detracts from the product quality and experience. Haas’s goal in founding Bubble was to lower the technological barrier to entrepreneurs entering the market, and to make the learning curve for technological skills as quick and smooth as possible. What excites him about no-code tools, Haas says, is the possibility of “turning an ordinary individual into a programmer or a software developer.” Indeed, a staggering 40% of Bubble users have no coding background. While Haas allows that “prior experience in programming definitely helps to smooth the learning curve and reduce time to pick things up,” even users with no experience can reach full Bubble proficiency in weeks and create sophisticated applications.
No-code represents a step forward in the evolution of programming: we have moved from low-level programming languages (such as Assembly, where you have to understand a specific machine language to give instructions), to abstract, high-level languages, like Python and Java (with syntax similar to that of English). Low-level languages offer granularity and flexibility, but moving to high-level programming makes it possible to develop software applications at scale in months, instead of years. Proponents of no-code take this a step further, arguing that no-code innovations could reduce that period even more, from months to days. “Today even many engineers are using Bubble to build applications because it's faster and more direct,” Haas says, and he hopes to see this trend continue.
The people working to democratize AI—many of whom, we emphasize, come from non-technical backgrounds—are full of groundbreaking ideas: for example, creating a universal language for human interactions with AI. Such a language would make it far easier for people without technical training to interact and build tools with AI. We can already see this powerful trend coming to life with the OpenAI API Playground interface, which uses natural language and does not require coding skills. We believe that combining this idea with no-code applications could create a revolutionary outcome.
Haas agrees: “We view our job as defining the vocabulary that can allow you to talk to the computer." Bubble’s initial focus is developing a language that allows humans to communicate with computers about requirements, design, and other elements of programs. The second step will be to teach the computer how to use that language to interact with humans. Haas says, “Currently, you have to draw and assemble the workflow manually in Bubble in order to build something, but it would be amazing to accelerate it by typing the English description and it popping into existence for you.”
In its current state, Bubble is a visual programming interface capable of building fully functional software applications. Integrating it with Codex (which you learned about in chapter 5) will, Haas predicts, result in an interactive no-code ecosystem that can understand the context and build an application from a simple English description. “I think that’s where no-code is eventually moving,” Haas says, “but the short-term challenge is the availability of training data. We have seen Codex work with JavaScript applications since there are massive public repositories of code that are supplemented with comments, notes, and everything else required for training an LLM.”
Codex has already created quite a stir in the AI community. New projects as of this writing include AI2SQL, a startup that helps to generate SQL queries from plain English, automating an otherwise time-consuming process, and Writepy, which uses Codex to power a platform for learning Python and analyzing data using English.
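Tools like these typically lean on prompt design rather than bespoke models. The sketch below shows one plausible few-shot prompt pattern for English-to-SQL; the schema, example, and wording are invented for illustration, not taken from AI2SQL itself.

```python
# Hypothetical few-shot prompt for English-to-SQL generation: give the
# model a table schema plus one worked example, then leave the completion
# open after "SQL:" so a code model like Codex can fill in the query.
def english_to_sql_prompt(schema: str, question: str) -> str:
    """Build a few-shot prompt asking a code model to emit a SQL query."""
    return (
        f"### Schema\n{schema}\n\n"
        "### Examples\n"
        "Q: How many users are there?\n"
        "SQL: SELECT COUNT(*) FROM users;\n\n"
        f"Q: {question}\n"
        "SQL:"
    )

prompt = english_to_sql_prompt(
    "users(id, name, signup_date)",       # invented schema
    "Which users signed up in 2021?",     # invented question
)
print(prompt)
```

Because the prompt ends at “SQL:”, the model’s most natural continuation is the query itself, which the tool can then execute or show to the user.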
Using no-code, you can develop applications through visual programming and drag-and-drop in an interface that smooths the learning curve and reduces the need for prerequisites. LLMs are capable of understanding context much as humans do, and can thus generate code with just a nudge from humans. We’re just now seeing the “initial potential” of combining them, says Haas. “I'm pretty sure if you interview me in five years, we will be using them internally. The integration between the two will make no-code more expressive and easier to learn. It will become a bit smarter and have a sense of empathy for what users are trying to accomplish.”
You learned in Chapter 5 about GitHub Copilot. This code-generation tool has the advantage of huge training datasets consisting of billions of lines of code in conventional programming languages like Python and JavaScript. Similarly, as no-code development picks up speed and more and more applications are created, their code will become part of the training data for a large language model. The logical connections between the visual components of no-code application logic and the generated code will serve as a vocabulary for the model training process. This vocabulary can then be fed to an LLM to generate a fully functional application from high-level textual descriptions. “It’s basically a matter of time until it becomes technically feasible,” says Haas.
Access and Model-as-a-Service
As we’ve described throughout this book, getting access to AI is becoming much easier across the board. Model-as-a-service is a burgeoning field where powerful AI models like GPT-3 are provided as a hosted service. Anyone can use that service via a simple API without worrying about collecting training data, training the model, hosting the application, and so forth.
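To make the pattern concrete, here is a minimal client-side sketch of what using a hosted model looks like. The endpoint, model name, and request fields follow OpenAI’s completions API as of this writing, but treat them as illustrative rather than definitive; check the provider’s documentation for the current interface.

```python
import json

# Model-as-a-service from the client's perspective: assemble a small JSON
# request and POST it to the provider. Training the model, hosting it, and
# scaling it are all handled on the other side of the API.
API_URL = "https://api.openai.com/v1/completions"  # illustrative endpoint

def build_completion_request(prompt, model="text-davinci-002", max_tokens=64):
    """Assemble the JSON body for a hosted text-completion call."""
    return {
        "model": model,
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0.7,
    }

body = build_completion_request("Translate to French: Hello, world.")
print(json.dumps(body, indent=2))

# Sending it is a single authenticated HTTP POST, e.g. with the requests
# library (API_KEY is a placeholder for your own credential):
#   requests.post(API_URL,
#                 headers={"Authorization": f"Bearer {API_KEY}"},
#                 json=body)
```

The point is how little the client has to know: a URL, a key, and a handful of JSON fields stand in for the entire training and serving pipeline.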
YouTube star Kilcher told us, “I think the level of knowledge required to interact with either these models or AI in general will decrease rapidly.” Early versions of tools like TensorFlow had little documentation and were “super cumbersome,” he explains, so “just the level of comfort we have right now in coding is amazing.” He cites tools like the Hugging Face Hub and Gradio alongside the OpenAI API, noting that such tools offer a “separation of concerns: ‘I am not good at running the model. I'm just going to let someone else do that.’” There are potential disadvantages to model-as-a-service, however: Kilcher notes the possibility that APIs and similar tools can create a “chokepoint.”
Kilcher’s colleague Awan says he’s excited about the “freeing effect” of model-as-a-service for creators. He notes that many people struggle with writing, “whether it's because of focus or attention span or something else. But they're brilliant thinkers and will benefit from the support in communicating their thoughts” with the help of “an AI tool that can help you put words on a page.”
Awan looks forward to future iterations of the model, especially in “mediums like music, video, graphic designers, and product designers,” whom he predicts will “benefit symbiotically from it and push all their mediums forward in ways we simply cannot conceptualize.”
Conclusion
GPT-3 marks an important milestone in the history of AI. It is also part of a bigger LLM trend that will only continue to grow. The revolutionary step of providing API access has created a new model-as-a-service business model.
Chapter 2 introduced you to the OpenAI Playground and showed you how to begin using it with several standard NLP tasks. You also learned about different variants of GPT-3 and how to balance the quality of output with pricing.
Chapter 3 tied together these concepts with a template for using GPT-3 with popular programming languages in your software applications. You also learned how to use a low-code GPT-3 sandbox to plug and play prompts for your use case.
The second half of the book presents a variety of use cases, from startups to enterprises. We also looked at the challenges and limitations of this technology: without great care, AI tools can amplify bias, invade privacy, and fuel the rise of low-quality digital content and misinformation.  They can also affect the environment. Fortunately, the OpenAI team and other researchers are working hard to create and deploy solutions to these problems.
The democratization of AI and the rise of no-code are encouraging signs that GPT-3 has the potential to empower ordinary people and make the world better.
All's well that ends well, dear reader. We hope you had as much fun learning about GPT-3 as we did sharing it with you. And we hope you will find it useful in your own journey to build impactful and innovative NLP products using GPT-3. We wish you the best of luck and great success!

[1] Andrej Karpathy et al., Generative Models blog post, source: https://openai.com/blog/generative-models/.
[2] Malcolm Gladwell, Outliers: The Story of Success (Little, Brown, 2008).
[3] Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan Gomez, Lukasz Kaiser, and Illia Polosukhin, “Attention Is All You Need,” Advances in Neural Information Processing Systems 30 (2017).
[4] Jay Alammar, The Illustrated Transformer blog post, source: https://jalammar.github.io/illustrated-transformer/.
[5] Jay Alammar, The Illustrated Transformer blog post, source: https://jalammar.github.io/illustrated-transformer/.
[6] Andrew Mayne, How to get better Q&A answers from GPT-3, source: https://andrewmayneblog.wordpress.com/2022/01/22/how-to-get-better-qa-answers-from-gpt-3/.
[8] For more than 200 documents, OpenAI offers a beta API.
[9] Blog post Customizing GPT-3 for Your Application, source: https://openai.com/blog/customized-gpt-3/
[10] A longstanding Internet abbreviation for “too long; didn’t read.”
[11] For a brief explanation, see this blog post by OpenAI; for a deeper dive, see the development team’s research paper.
[12] You can watch Dracula on Vimeo; a Fable Studios blog post also offers a behind-the-scenes overview.
[13] Shubham Saboo, blog post GPT-3 for Corporates — Is Data Privacy an Issue?, source: https://pub.towardsai.net/gpt-3-for-corporates-is-data-privacy-an-issue-92508aa30a00.
[14] Nat Friedman, blog post Introducing GitHub Copilot: your AI pair programmer, source: https://github.blog/2021-06-29-introducing-github-copilot-ai-pair-programmer/.
[15] Harri Edwards, source: https://github.com/features/copilot/
[16] The European Union’s General Data Protection Regulation prohibits companies from hiding behind illegible terms and conditions that are difficult to understand. It requires companies to clearly define their data privacy policies and make them easily accessible.
[17] Emily M. Bender, Angelina McMillan-Major, Timnit Gebru, and Shmargaret Shmitchell, “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?” In Conference on Fairness, Accountability, and Transparency (FAccT ’21), March 3–10, 2021, virtual event, Canada. https://doi.org/10.1145/3442188.3445922. The fallout from this paper forced one of its coauthors, acclaimed AI ethics researcher Timnit Gebru, to leave Google.
[18] Samuel Gehman, Suchin Gururangan, Maarten Sap, Yejin Choi, and Noah A. Smith, “RealToxicityPrompts: Evaluating Neural Toxic Degeneration in Language Models,” ACL Anthology, Findings of the Association for Computational Linguistics: EMNLP 2020, https://aclanthology.org/2020.findings-emnlp.301.
[19] Abubakar Abid, Maheen Farooqi, and James Zou, “Persistent Anti-Muslim Bias in Large Language Models,” Computation and Language, January 2021, https://arxiv.org/pdf/2101.05783.pdf.
[20] Perspective API is an open-source API that uses machine learning to identify "toxic" comments, making it easier to host better conversations online. It emerged from a collaborative research effort by two teams within Google: the Counter Abuse Technology team and Jigsaw, a team that explores threats to open societies.
[21] Chengcheng Shao, Giovanni Luca Ciampaglia, Onur Varol, Kai-Cheng Yang, Alessandro Flammini, and Filippo Menczer, “The spread of low-credibility content by social bots,” Nature Human Behaviour, 2018, https://www.nature.com/articles/s41562-017-0132.
[22] Onur Varol, Emilio Ferrara, Clayton A. Davis, Filippo Menczer, and Alessandro Flammini, “Online Human-Bot Interactions: Detection, Estimation, and Characterization,” Eleventh International AAAI Conference on Web and Social Media, 2017, https://aaai.org/ocs/index.php/ICWSM/ICWSM17/paper/view/15587.
[23] Ben Buchanan, Micah Musser, Andrew Loh, and Katerina Sedova, “Truth, Lies, and Automation: How Language Models Could Change Disinformation,” Center for Security and Emerging Technology, 2021, https://cset.georgetown.edu/wp-content/uploads/CSET-Truth-Lies-and-Automation.pdf, Table 1.
[24] Buchanan et al., “Truth, Lies, and Automation,” p. 6.
[25] Buchanan et al., “Truth, Lies, and Automation,” p. 21.
[26] Buchanan et al., “Truth, Lies, and Automation,” p. 44.
[27] Buchanan et al., “Truth, Lies, and Automation,” p. 34.
[28] David Patterson, Joseph Gonzalez, Quoc Le, Chen Liang, Lluis-Miquel Munguia, Daniel Rothchild, David So, Maud Texier, and Jeff Dean, “Carbon Emissions and Large Neural Network Training,” arXiv preprint arXiv:2104.10350 (2021).