Authorization

Overview

Our API uses bearer token authentication to secure access and ensure that only authorized users can perform certain operations. All of our inference endpoint APIs use this method, so a single bearer token works across these services. Because each request carries its own credentials, communication with our REST API is stateless and no session-based authentication is needed.

Obtaining a Bearer Token

To use our Inference API, you first need to obtain a bearer token. A bearer token is a unique identifier that grants secure access to our API endpoints.

Steps to Generate a Bearer Token

  1. Go to Keys -> Inference API Keys

  2. Click the Create button.

  3. Once generated, your API key serves as your bearer token and is valid for all inference endpoint APIs.

Using the Bearer Token

After obtaining your bearer token, you can use it to authenticate requests to any of the inference endpoint APIs. Every request must include the token in the Authorization header, in the standard form Authorization: Bearer <token>.
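As a minimal sketch of what an authenticated request looks like, the snippet below builds a POST request that carries the bearer token in the Authorization header. The endpoint URL and JSON payload are hypothetical placeholders, not documented values; substitute the URL of your own inference endpoint.

```python
import json
import urllib.request

# Hypothetical endpoint URL -- replace with your actual inference endpoint.
API_URL = "https://inference.datacrunch.io/v1/your-deployment/predict"

def build_auth_headers(token: str) -> dict:
    """Return headers carrying the bearer token in the required form."""
    return {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }

headers = build_auth_headers("<your-inference-api-key>")
body = json.dumps({"input": "hello"}).encode("utf-8")

# Build the request; actually sending it requires a valid API key:
request = urllib.request.Request(API_URL, data=body, headers=headers, method="POST")
# response = urllib.request.urlopen(request)
```

The same header works for any of the inference endpoint APIs, since one key is valid across all of them.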

Security Notes

  • Keep your bearer token secure and never expose it in client-side code.

  • If you suspect that your token has been compromised, generate a new token immediately.
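One common way to keep the token out of client-side code and source control is to load it from an environment variable at runtime. The variable name DATACRUNCH_API_KEY below is an illustrative convention, not a requirement of the platform:

```python
import os

# Load the bearer token from the environment instead of hardcoding it.
# DATACRUNCH_API_KEY is an illustrative variable name, not a platform requirement.
token = os.environ.get("DATACRUNCH_API_KEY", "")
if not token:
    print("DATACRUNCH_API_KEY is not set; requests cannot be authenticated.")
```

Rotating a compromised key then only requires updating the environment, not redeploying code.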
