OpenAI President Greg Brockman (l) and Microsoft CTO Kevin Scott

Microsoft Build: How AI as Copilot Gets Developers to Think Intuitively

At Microsoft Build 2023, Microsoft and OpenAI executives discuss the benefits of generative AI serving as a "copilot" to developers.

Generative AI, and AI more broadly, can act as a "copilot" for professionals of many kinds: it operates alongside them, augmenting their work by automating routine tasks, enhancing productivity, and offering insights.

For a software developer, for example, generative AI can serve as a copilot for code generation and completion, producing code snippets or finishing partially written code.
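In practice, that can be as simple as sending partially written code to a model and asking for the rest. Here is a minimal sketch using the OpenAI Python SDK; the model name, prompt wording, and example function are illustrative assumptions, not details from the Build session:

```python
# Minimal sketch: asking a chat model to complete a partially written function.
# Assumes the `openai` package (v1+) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

partial_code = (
    "def median(values: list[float]) -> float:\n"
    "    # Return the median of a non-empty list of numbers.\n"
)

response = client.chat.completions.create(
    model="gpt-4",  # assumption: any capable chat model works here
    messages=[
        {"role": "system", "content": "You complete Python code. Return only code."},
        {"role": "user", "content": f"Complete this function:\n\n{partial_code}"},
    ],
)

print(response.choices[0].message.content)  # the model's suggested completion
```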

The benefits of AI acting as a copilot for software developers include:

  • increased productivity
  • reduced repetitive tasks
  • improved code quality
  • enhanced debugging capabilities
  • accelerated development cycles
  • better utilization of developer expertise

At Microsoft Build, the software giant's annual flagship event for developers taking place this week, Microsoft's chief technology officer and executive vice president of AI, Kevin Scott, and Greg Brockman, president and co-founder of ChatGPT maker OpenAI, which Microsoft has invested billions of dollars in, spoke at length in a session titled "The Era of the AI Copilot" about the potential for AI to serve in this copilot capacity.

Brockman called the development of generative AI plugins an "amazing opportunity" for developers to leverage the technology in a way that makes the system better for everyone, and noted that ChatGPT's plugin support was designed on an open standard.

"I think that this kind of core design principle of really having any developer who wants be able to plug in and get the power of the system and be able to bring all the power of any domain into ChatGPT is really, really amazing," he said.

He added that OpenAI is still in the early stages of "really pushing" GPT-4's capabilities and that the technology is "clearly" getting better and better.

"The thing that I think every developer can do that is hard for us — and even Microsoft at Microsoft's scale — to do is to really go into specific domains and figure out how to make this technology work there," Brockman said.

Scott pointed out that ChatGPT fits into Microsoft's larger strategy of generative AI as a copilot, which spans integrations with Bing Chat, GitHub Copilot, Microsoft Security Copilot, Microsoft 365 Copilot, and Designer, among others.

"The thing that we noticed as we were building these copilots, starting with GitHub Copilot several years ago, is that the idea of a copilot is actually pretty general," he explained. "So, this notion that you're going to have a multi-turn conversational agent-like interface on your software that helps you do cognitively complex things applies to more than just helping someone do software development."

Building an AI Copilot

Scott explained that the first step in building a successful AI copilot is understanding the unmet user need, then applying the technology required to solve that problem.

"One thing in particular that you have to really bear in mind is the model is not your product," he said. "Unless you are an infrastructure company, the model itself is just infrastructure that is enabling your product. It isn't the thing in and of itself."

A focus on end-user experience is critical, Scott explained, including homing in on user interface elements, trying to fully anticipate the needs of the user, and architecting applications in a way that gives people intuitive access to the full functionality and capabilities built into the code.

"With a copilot you're going to spend less time trying to second-guess the user about what it is they want because they have this really natural mechanism to express what it is they want, natural language," he said. "What you have to think about in the design of these copilots is what it is you want the copilot to actually be capable of."

What an AI Copilot Should Not Do

Scott added that on the flip side of that, developers must think about what they want the copilot not to do.

"This is important in how you're thinking about safety, but also because the thing at the bottom of the stack, these foundation models are sort of like a big bucket of unrestrained capability," he said. "You're the one who oftentimes has to restrain it to your particular domain."

Scott pointed to Microsoft's focus on AI safety, noting that it's the first step taken when thinking about building copilots.

"We think about it at every step of the process," he said. "We're giving you all some amazing tools to go build really safe, responsible AI applications."

About the author

Nathan Eddy is a freelance writer for ITPro Today. He has written for Popular Mechanics, Sales & Marketing Management Magazine, FierceMarkets, and CRN, among others. In 2012 he made his first documentary film, The Absent Column. He currently lives in Berlin.