Here at Nuvalence, we’re really excited about OpenAI’s new ChatGPT Plugins capability. When we got the opportunity to try it out for ourselves, we dove right in and started developing plugins of our own.

Along the way, we learned a lot about the simplicity of the tool, but we also uncovered areas of real complexity. Authentication is one of those areas we wanted to understand on a deeper level.

In this post, we’ll delve into the specifics of each authentication method, guiding developers to choose the best option for their use case. We’ll also cover some important best practices and give examples to help illustrate them. Let’s dig in!

It’s Never Too Early To Think About Authentication

In our first post in this series, we predicted that ChatGPT plugins could revolutionize the way businesses think about their software development investments. We kept that idea in mind as we explored the capability and checked out our fellow engineers’ plugins. 

One thing we noticed was that the first crop of plugins is generally operating unauthenticated or with service-level authentication. Some leverage publicly available information, either via open APIs or publicly shared Google Drive documents, while others sidestep the issue by providing links that redirect users to account-driven sites. With ChatGPT plugins still in their infancy, this is a logical starting point.

But if plugins really catch on, businesses will begin to want to combine their own data with ChatGPT for more powerful, personalized results. And with that kind of sophistication comes the need for robust security through authentication.

Flexible Authentication with ChatGPT Plugins

ChatGPT’s plugin authentication system offers four schemes: unauthenticated, service level, user level, and OAuth. Each caters to different security requirements and user experiences. Regardless of the scheme you choose, configuration takes place in the Manifest File (ai-plugin.json).

This file, familiar to you if you’ve read the second post in our series, is a JSON file containing general information about the plugin you are developing. It includes details such as the plugin’s name and description, API hosting information, and, importantly, the authentication configuration.

"auth": {
  "type": "oauth",
  "client_url": "https://my_server.com/authorize",
  "scope": "",
  "authorization_url": "https://my_server.com/token",
  "authorization_content_type": "application/json",
  "verification_tokens": {
    "openai": "abc123456"
  }
}

The snippet above shows an example OAuth configuration. Every scheme requires a type (none, user_http, service_http, or oauth); for the unauthenticated scheme, type is the only required property. The setup itself is pretty straightforward. What demands more attention is choosing the authentication scheme that best fits your use case.

Choosing the Right Authentication

When you choose your authentication type, it’s important to consider not just your current use case, but future use cases as well. Our team gave this choice a lot of thought during our experiments and testing, and my colleague Alexander Jettel applied what we learned when he developed his popular ChatGPT plugin, VideoInsights.  

Unauthenticated (none)

Alexander chose this scheme for the VideoInsights plugin. It is ideal for applications that leverage open APIs without access restrictions: users can send requests to the API directly, with no barriers, which makes it an optimal choice for APIs intended to be freely accessible. The decision was a natural fit for VideoInsights, since it lets any user quickly get a summary of a video from publicly accessible services.
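
This is also the simplest scheme to declare. Based on OpenAI’s plugin documentation, the manifest entry for an unauthenticated plugin looks roughly like this:

"auth": {
  "type": "none"
}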

However, had VideoInsights been designed to summarize a user’s private videos, an appropriate authorization scheme would have been necessary to access that private data securely while preserving its integrity and privacy.

Authorization is also worth considering as a way to prevent misuse and manage costs: each call to VideoInsights entails a call to ChatGPT, so unrestricted free usage could lead to escalating costs. Authorization would let us meter and limit usage per customer, though it’s worth noting that this would introduce a usage barrier.

Should Alexander decide to extend VideoInsights’ reach, he might employ role-based access to facilitate free and paid tiers. The future of VideoInsights is yet to be fully explored; we are only beginning to understand the potential of combining data and existing APIs with intelligence to develop new products and services. As we venture forward, ensuring secure connections will become increasingly vital.

If Alexander chose to incorporate authorization into VideoInsights, OpenAI provides several options: user-level, service-level, and OAuth-based authentication. OpenAI’s documentation outlines the advantages and disadvantages of each authentication scheme, but let me highlight some key considerations.

User-level authentication (user_http)

While user-level authentication is an option, OpenAI does not recommend it. This method currently requires the user to paste their personal API token into the ChatGPT prompt window during installation, which many users may find unnerving or perceive as a security risk. It is also not supported in the plugin store, so only use this option if you are developing an “unverified app” that you will have to distribute manually.
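
If you do go this route, the manifest entry itself is minimal. Based on OpenAI’s documentation, it looks something like the snippet below, with the user-supplied token then sent as a bearer credential on each request:

"auth": {
  "type": "user_http",
  "authorization_type": "bearer"
}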

Service-level authentication (service_http)

Service-level authentication, on the other hand, provides secure communication between a plugin and a backend API. OpenAI favors this method for its simplicity and security, and setup is straightforward: you supply your API’s access token when registering the plugin and add the verification token that OpenAI returns to the manifest. That said, if you are looking for any kind of per-user granularity or individual access control, this is not the authentication you are looking for. But if you have secured API endpoints that you want to connect to ChatGPT for public consumption, this is the way to go.
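
The corresponding manifest entry, per OpenAI’s documentation, looks something like the snippet below. The verification token is a placeholder for the value OpenAI returns when you register your service token, and ChatGPT then attaches that service token as a bearer credential to every request it makes to your API:

"auth": {
  "type": "service_http",
  "authorization_type": "bearer",
  "verification_tokens": {
    "openai": "abc123456"
  }
}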

OAuth (oauth)

If your application requires fine-grained access control and strong security, and you value your user experience, OAuth is the solution for plugins. This is especially true for applications handling sensitive user data or integrating with other platforms. OAuth has the advantage of being an industry standard with a well-understood implementation pattern, and most users have come to expect an easy login to third-party websites through their Google or Facebook accounts. As the ChatGPT plugin ecosystem matures, I believe we will see many of the plugins in the store utilizing OAuth.
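
One practical note if you choose OAuth: after the user signs in at your client_url and ChatGPT exchanges the authorization code at your authorization_url, your token endpoint is expected to return a standard OAuth token response. Based on OpenAI’s documentation, a minimal response body looks roughly like this (all values are placeholders):

{
  "access_token": "example_access_token",
  "token_type": "bearer",
  "refresh_token": "example_refresh_token",
  "expires_in": 3600
}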

Conclusion

Overall, we found that ChatGPT’s plugin framework is really well-positioned to ensure the security of plugins across a wide variety of use cases – from public tools that present minimal risk to the most sensitive and mission-critical scenarios. Authentication is always a complex aspect of software development, and we were glad to see that OpenAI made it easier for us to tame that complexity and make the right choice for our own plugin.

So far in this series, we’ve explored the highest-level observations and have also gotten into the weeds on plugin development and authentication. In the next post of our series, my colleague Alexander Jettel will look back on his experience developing a popular plugin that’s been adopted by many members of the ChatGPT Plus community.

This is the third in a series of posts documenting our experiments with OpenAI’s ChatGPT. You can read about our overall impressions and our deep dive into building a plugin in the earlier posts in this series.

Nuvalence can help you unlock the possibilities of AI to propel your business forward.

LET’S TALK