Mock sample for your project: Cosmos DB API

Integrate with "Cosmos DB API" from azure.com in no time with Mockoon's ready to use mock sample

Cosmos DB

azure.com

Version: 2019-08-01-preview


Use this API in your project

Speed up your application development by using the "Cosmos DB API" ready-to-use mock sample. Mocking this API allows you to start working in no time: no more accounts to create, API keys to provision, access to configure, or unplanned downtime. Just work.
It also improves the quality and reliability of your integration tests by letting you simulate random failures, slow response times, and other real-world conditions.
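
Once the sample is running in Mockoon (or through the Mockoon CLI), you can call it like the real API. The sketch below is a minimal, hypothetical example: the local port, subscription ID, and request path follow the usual Azure Resource Manager conventions for Cosmos DB database accounts, but they are assumptions rather than values taken from this sample.

```python
# Minimal sketch: querying a locally running Mockoon mock of the
# Cosmos DB resource provider API. The port (3000), subscription ID,
# and path are hypothetical placeholders.
import requests

BASE_URL = "http://localhost:3000"  # assumed Mockoon mock port
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

# List database accounts, mirroring the ARM-style path used by the
# Cosmos DB resource provider (API version 2019-08-01-preview).
response = requests.get(
    f"{BASE_URL}/subscriptions/{SUBSCRIPTION_ID}/providers"
    "/Microsoft.DocumentDB/databaseAccounts",
    params={"api-version": "2019-08-01-preview"},
    timeout=10,
)
response.raise_for_status()
print(response.json())
```

Pointing your existing HTTP client or SDK configuration at the mock's base URL is usually all that is needed to switch between the mock and the real endpoint.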

Description

Azure Cosmos DB Database Service Resource Provider REST API

Other APIs by azure.com

Azure Log Analytics

azure.com
Azure Log Analytics API reference

CertificateRegistrationProvider API Client

azure.com

PolicyClient

azure.com
To manage and control access to your resources, you can define customized policies and assign them at a scope.

iotDpsClient

azure.com
API for using the Azure IoT Hub Device Provisioning Service features.

Power BI Embedded Management Client

azure.com
Client to manage your Power BI Embedded workspace collections and retrieve workspaces.

Azure Reservation

azure.com
Microsoft Azure Quota Resource Provider.

Mixed Reality

azure.com
Mixed Reality Resource Provider Remote Rendering Resource API

SqlManagementClient

azure.com
The Azure SQL Database management API provides a RESTful set of web APIs that interact with Azure SQL Database services to manage your databases. The API enables users to create, retrieve, update, and delete databases, servers, and other entities.

PostgreSQLManagementClient

azure.com
The Microsoft Azure management API provides create, read, update, and delete functionality for Azure PostgreSQL resources, including servers, databases, firewall rules, VNET rules, security alert policies, log files, and configurations under the new business model.

Service Map

azure.com
Service Map API Reference

Azure SQL Database Import/Export spec

azure.com
Provides create and read functionality for Import/Export operations on Azure SQL databases.

Other APIs in the same category

DataLakeStoreFileSystemManagementClient

azure.com
Creates an Azure Data Lake Store filesystem client.

ApiManagementClient

azure.com
Use these REST APIs to get the network connectivity status of your Azure API Management deployment. When the API Management service is deployed inside a Virtual Network, it needs access to other Azure resources it depends on. These APIs also give details about the DNS servers visible to the Azure API Management deployment.

AWS IoT Events Data

AWS IoT Events monitors your equipment or device fleets for failures or changes in operation, and triggers actions when such events occur. You can use AWS IoT Events Data API commands to send inputs to detectors, list detectors, and view or update a detector's status. For more information, see What is AWS IoT Events? in the AWS IoT Events Developer Guide.
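
As a hedged illustration of the operations described above, the boto3 sketch below sends an input message and lists detectors; the input name, detector model name, and payload are hypothetical placeholders.

```python
# Sketch only: sending an input to AWS IoT Events and listing detectors
# with boto3. Input name, detector model name, and payload are
# hypothetical placeholders.
import json
import boto3

client = boto3.client("iotevents-data")

# Send a message to a (hypothetical) input called "TemperatureInput".
client.batch_put_message(
    messages=[
        {
            "messageId": "msg-001",
            "inputName": "TemperatureInput",
            "payload": json.dumps({"temperature": 92}).encode("utf-8"),
        }
    ]
)

# List detectors for a (hypothetical) detector model.
detectors = client.list_detectors(detectorModelName="OverheatDetectorModel")
print(detectors["detectorSummaries"])
```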

AWS Data Pipeline

AWS Data Pipeline configures and manages a data-driven workflow called a pipeline. AWS Data Pipeline handles the details of scheduling and ensuring that data dependencies are met so that your application can focus on processing the data.

AWS Data Pipeline provides a JAR implementation of a task runner called AWS Data Pipeline Task Runner. AWS Data Pipeline Task Runner provides logic for common data management scenarios, such as performing database queries and running data analysis using Amazon Elastic MapReduce (Amazon EMR). You can use AWS Data Pipeline Task Runner as your task runner, or you can write your own task runner to provide custom data management.

AWS Data Pipeline implements two main sets of functionality. Use the first set to create a pipeline and define data sources, schedules, dependencies, and the transforms to be performed on the data. Use the second set in your task runner application to receive the next task ready for processing. The logic for performing the task, such as querying the data, running data analysis, or converting the data from one format to another, is contained within the task runner. The task runner performs the task assigned to it by the web service, reporting progress to the web service as it does so. When the task is done, the task runner reports the final success or failure of the task to the web service.

AWS Organizations

AWS Organizations is a web service that enables you to consolidate your multiple AWS accounts into an organization and centrally manage your accounts and their resources. This guide provides descriptions of the Organizations operations. For more information about using this service, see the AWS Organizations User Guide.

Support and feedback for AWS Organizations: we welcome your feedback. Send your comments to [email protected] or post your feedback and questions in the AWS Organizations support forum. For more information about the AWS support forums, see Forums Help.

Endpoint to call when using the AWS CLI or the AWS SDK: for the current release of Organizations, specify the us-east-1 region for all AWS API and AWS CLI calls made from the commercial AWS Regions outside of China. If calling from one of the AWS Regions in China, specify cn-northwest-1. You can do this in the AWS CLI by using these parameters and commands:

Use the following parameter with each command to specify both the endpoint and its region: --endpoint-url https://organizations.us-east-1.amazonaws.com (from commercial AWS Regions outside of China) or --endpoint-url https://organizations.cn-northwest-1.amazonaws.com.cn (from AWS Regions in China).

Use the default endpoint, but configure your default region with this command: aws configure set default.region us-east-1 (from commercial AWS Regions outside of China) or aws configure set default.region cn-northwest-1 (from AWS Regions in China).

Use the following parameter with each command to specify the endpoint: --region us-east-1 (from commercial AWS Regions outside of China) or --region cn-northwest-1 (from AWS Regions in China).

Recording API Requests: AWS Organizations supports AWS CloudTrail, a service that records AWS API calls for your AWS account and delivers log files to an Amazon S3 bucket. By using information collected by AWS CloudTrail, you can determine which requests the Organizations service received, who made the request and when, and so on. For more about AWS Organizations and its support for AWS CloudTrail, see Logging AWS Organizations Events with AWS CloudTrail in the AWS Organizations User Guide. To learn more about AWS CloudTrail, including how to turn it on and find your log files, see the AWS CloudTrail User Guide.
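
The same region pinning applies when using an SDK instead of the CLI. A minimal boto3 sketch, assuming the caller already has the required Organizations permissions:

```python
# Sketch only: pinning the Organizations client to the us-east-1
# endpoint, as the description above requires for commercial AWS
# Regions outside of China (use "cn-northwest-1" from Regions in China).
import boto3

org = boto3.client("organizations", region_name="us-east-1")

# List the accounts in the organization, page by page.
for page in org.get_paginator("list_accounts").paginate():
    for account in page["Accounts"]:
        print(account["Id"], account.get("Name"))
```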

Amazon EC2 Container Registry

Amazon Elastic Container Registry (Amazon ECR) is a managed container image registry service. Customers can use the familiar Docker CLI, or their preferred client, to push, pull, and manage images. Amazon ECR provides a secure, scalable, and reliable registry for your Docker or Open Container Initiative (OCI) images. Amazon ECR supports private repositories with resource-based permissions using IAM so that specific users or Amazon EC2 instances can access repositories and images. Amazon ECR has service endpoints in each supported Region. For more information, see Amazon ECR endpoints in the Amazon Web Services General Reference.

Amazon MQ

Amazon MQ is a managed message broker service for Apache ActiveMQ and RabbitMQ that makes it easy to set up and operate message brokers in the cloud. A message broker allows software applications and components to communicate using various programming languages, operating systems, and formal messaging protocols.

Access Analyzer

Identity and Access Management Access Analyzer helps identify potential resource-access risks by enabling you to identify any policies that grant access to an external principal. It does this by using logic-based reasoning to analyze resource-based policies in your Amazon Web Services environment. An external principal can be another Amazon Web Services account, a root user, an IAM user or role, a federated user, an Amazon Web Services service, or an anonymous user. You can also use IAM Access Analyzer to preview and validate public and cross-account access to your resources before deploying permissions changes. This guide describes the Identity and Access Management Access Analyzer operations that you can call programmatically. For general information about IAM Access Analyzer, see Identity and Access Management Access Analyzer in the IAM User Guide. To start using IAM Access Analyzer, you first need to create an analyzer.
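
The description notes that the first step is creating an analyzer. A minimal boto3 sketch of that step, with an arbitrarily chosen analyzer name:

```python
# Sketch only: creating an account-level IAM Access Analyzer with
# boto3. The analyzer name is an arbitrary placeholder.
import boto3

analyzer_client = boto3.client("accessanalyzer")

response = analyzer_client.create_analyzer(
    analyzerName="example-account-analyzer",
    type="ACCOUNT",  # analyze resources within this account
)
print(response["arn"])
```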

Amazon SimpleDB

Amazon SimpleDB is a web service providing the core database functions of data indexing and querying in the cloud. By offloading the time and effort associated with building and operating a web-scale database, SimpleDB provides developers the freedom to focus on application development. A traditional, clustered relational database requires a sizable upfront capital outlay, is complex to design, and often requires extensive and repetitive database administration. Amazon SimpleDB is dramatically simpler, requiring no schema, automatically indexing your data and providing a simple API for storage and access. This approach eliminates the administrative burden of data modeling, index maintenance, and performance tuning. Developers gain access to this functionality within Amazon's proven computing environment, are able to scale instantly, and pay only for what they use. Visit http://aws.amazon.com/simpledb/ for more information.

AWS Batch

Using Batch, you can run batch computing workloads on the Cloud. Batch computing is a common means for developers, scientists, and engineers to access large amounts of compute resources. Batch uses the advantages of this computing workload to remove the undifferentiated heavy lifting of configuring and managing required infrastructure. At the same time, it also adopts a familiar batch computing software approach. Given these advantages, Batch can help you to efficiently provision resources in response to jobs submitted, thus effectively helping you to eliminate capacity constraints, reduce compute costs, and deliver your results more quickly.

As a fully managed service, Batch can run batch computing workloads of any scale. Batch automatically provisions compute resources and optimizes workload distribution based on the quantity and scale of your specific workloads. With Batch, there's no need to install or manage batch computing software. This means that you can focus your time and energy on analyzing results and solving your specific problems.

Service Quotas

With Service Quotas, you can view and manage your quotas easily as your AWS workloads grow. Quotas, also referred to as limits, are the maximum number of resources that you can create in your AWS account. For more information, see the Service Quotas User Guide.

AWS IoT Data Plane

IoT data enables secure, bi-directional communication between Internet-connected things (such as sensors, actuators, embedded devices, or smart appliances) and the Amazon Web Services cloud. It implements a broker for applications and things to publish messages over HTTP (Publish) and retrieve, update, and delete shadows. A shadow is a persistent representation of your things and their state in the Amazon Web Services cloud. Find the endpoint address for actions in IoT data by running this CLI command: aws iot describe-endpoint --endpoint-type iot:Data-ATS. The service name used by Amazon Web Services Signature Version 4 to sign requests is: iotdevicegateway.
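
A hedged boto3 sketch of the same flow, resolving the data endpoint and then publishing a message; the topic name and payload are hypothetical placeholders.

```python
# Sketch only: resolving the IoT data endpoint and publishing a message
# with boto3, mirroring the CLI command above. Topic and payload are
# hypothetical placeholders.
import json
import boto3

# Equivalent of: aws iot describe-endpoint --endpoint-type iot:Data-ATS
iot = boto3.client("iot")
endpoint = iot.describe_endpoint(endpointType="iot:Data-ATS")["endpointAddress"]

# Point the data-plane client at the account-specific endpoint.
iot_data = boto3.client("iot-data", endpoint_url=f"https://{endpoint}")

iot_data.publish(
    topic="devices/sensor-1/telemetry",  # hypothetical topic
    qos=0,
    payload=json.dumps({"temperature": 21.5}).encode("utf-8"),
)
```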