Mock sample for your project: Amazon Comprehend API

Integrate with "Amazon Comprehend API" from amazonaws.com in no time with Mockoon's ready-to-use mock sample

Amazon Comprehend

amazonaws.com

Version: 2017-11-27


Use this API in your project

Integrate third-party APIs faster by using the "Amazon Comprehend API" ready-to-use mock sample. Mocking this API will allow you to start working in no time. No more accounts to create, API keys to provision, access to configure, or unplanned downtime: just work.
Improve your integration tests by mocking third-party APIs to cover more edge cases: slow response times, random failures, etc.
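
For instance, a slow or failing route defined in the mock can be exercised directly from test code. Below is a minimal sketch, assuming the mock runs locally on http://localhost:3000 and the AWS SDK for JavaScript v3 is used; the fallback helper, the port, and all names are illustrative and not part of the mock sample.

```typescript
// Sketch of an edge-case integration test helper. The response delay itself is configured
// on the mocked route in Mockoon; this code only has to survive it.
import { ComprehendClient, DetectSentimentCommand } from "@aws-sdk/client-comprehend";
import { NodeHttpHandler } from "@smithy/node-http-handler"; // @aws-sdk/node-http-handler on older SDK versions

const client = new ComprehendClient({
  region: "us-east-1",                                            // any value works against a mock
  endpoint: "http://localhost:3000",                              // the local Mockoon mock, not AWS
  credentials: { accessKeyId: "mock", secretAccessKey: "mock" },  // dummy credentials, never validated by the mock
  requestHandler: new NodeHttpHandler({ requestTimeout: 500 }),   // fail fast when the mock responds slowly
});

// Returns a neutral sentiment when the (mocked) API is slow or down,
// so tests can assert the fallback path without touching the real service.
export async function sentimentWithFallback(text: string): Promise<string> {
  try {
    const res = await client.send(new DetectSentimentCommand({ Text: text, LanguageCode: "en" }));
    return res.Sentiment ?? "NEUTRAL";
  } catch {
    return "NEUTRAL";
  }
}
```

Pointing the client's endpoint at the mock is what keeps the test hermetic: no AWS account, real credentials, or network access to AWS is involved.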

Description

Amazon Comprehend is an AWS service for gaining insight into the content of documents. Use these actions to determine the topics your documents discuss, the predominant sentiment expressed in them, the predominant language used, and more.
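
As a minimal usage sketch of those capabilities, the snippet below calls DetectDominantLanguage through a locally mocked endpoint. The http://localhost:3000 URL and the dummy credentials are assumptions about how the mock is run; against the mock sample, the returned values are whatever the mocked route defines.

```typescript
// Sketch: detect the dominant language of a text via the mocked Comprehend endpoint.
import { ComprehendClient, DetectDominantLanguageCommand } from "@aws-sdk/client-comprehend";

const comprehend = new ComprehendClient({
  region: "us-east-1",
  endpoint: "http://localhost:3000",                              // local Mockoon mock instead of AWS
  credentials: { accessKeyId: "mock", secretAccessKey: "mock" },  // placeholders, never checked by the mock
});

async function main(): Promise<void> {
  const response = await comprehend.send(
    new DetectDominantLanguageCommand({ Text: "Mockoon makes API mocking painless." })
  );
  // e.g. [{ LanguageCode: "en", Score: 0.99 }] if the mock returns the sample body
  console.log(response.Languages);
}

main().catch(console.error);
```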

Other APIs by amazonaws.com

Amazon CloudDirectory

Amazon Cloud Directory is a component of the AWS Directory Service that simplifies the development and management of cloud-scale web, mobile, and IoT applications. This guide describes the Cloud Directory operations that you can call programmatically and includes detailed information on data types and errors. For information about Cloud Directory features, see AWS Directory Service and the Amazon Cloud Directory Developer Guide.

AWS Comprehend Medical

Amazon Comprehend Medical extracts structured information from unstructured clinical text. Use these actions to gain insight into your documents.

AWS Budgets

The AWS Budgets API enables you to use AWS Budgets to plan your service usage, service costs, and instance reservations. The API reference provides descriptions, syntax, and usage examples for each of the actions and data types for AWS Budgets.

Budgets provide you with a way to see the following information:
- How close your plan is to your budgeted amount or to the free tier limits
- Your usage-to-date, including how much you've used of your Reserved Instances (RIs)
- Your current estimated charges from AWS, and how much your predicted usage will accrue in charges by the end of the month
- How much of your budget has been used

AWS updates your budget status several times a day. Budgets track your unblended costs, subscriptions, refunds, and RIs. You can create the following types of budgets:
- Cost budgets: Plan how much you want to spend on a service.
- Usage budgets: Plan how much you want to use one or more services.
- RI utilization budgets: Define a utilization threshold, and receive alerts when your RI usage falls below that threshold. This lets you see if your RIs are unused or under-utilized.
- RI coverage budgets: Define a coverage threshold, and receive alerts when the number of your instance hours that are covered by RIs falls below that threshold. This lets you see how much of your instance usage is covered by a reservation.

Service endpoint: the AWS Budgets API provides the following endpoint: https://budgets.amazonaws.com. For information about costs that are associated with the AWS Budgets API, see AWS Cost Management Pricing.
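
As an illustration of the cost budget type described above, here is a hedged sketch using the AWS SDK for JavaScript v3. The account ID, budget name, and amount are placeholders; the request shape follows the CreateBudget operation.

```typescript
// Sketch: create a monthly cost budget (all values are placeholders, not recommendations).
import { BudgetsClient, CreateBudgetCommand } from "@aws-sdk/client-budgets";

const budgets = new BudgetsClient({ region: "us-east-1" });

export async function createMonthlyCostBudget(): Promise<void> {
  await budgets.send(
    new CreateBudgetCommand({
      AccountId: "123456789012",                       // placeholder AWS account ID
      Budget: {
        BudgetName: "monthly-cost-budget",
        BudgetType: "COST",                            // one of the budget types listed above
        TimeUnit: "MONTHLY",
        BudgetLimit: { Amount: "100", Unit: "USD" },
      },
    })
  );
}
```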

AWS DataSync

DataSync is a managed data transfer service that makes it simpler for you to automate moving data between on-premises storage and Amazon Simple Storage Service (Amazon S3) or Amazon Elastic File System (Amazon EFS). This API interface reference for DataSync contains documentation for a programming interface that you can use to manage DataSync.

Amazon SimpleDB

Amazon SimpleDB is a web service providing the core database functions of data indexing and querying in the cloud. By offloading the time and effort associated with building and operating a web-scale database, SimpleDB provides developers the freedom to focus on application development. A traditional, clustered relational database requires a sizable upfront capital outlay, is complex to design, and often requires extensive and repetitive database administration. Amazon SimpleDB is dramatically simpler, requiring no schema, automatically indexing your data and providing a simple API for storage and access. This approach eliminates the administrative burden of data modeling, index maintenance, and performance tuning. Developers gain access to this functionality within Amazon's proven computing environment, are able to scale instantly, and pay only for what they use. Visit http://aws.amazon.com/simpledb/ for more information.

Amazon Cognito Sync

Amazon Cognito Sync provides an AWS service and client library that enable cross-device syncing of application-related user data. High-level client libraries are available for both iOS and Android. You can use these libraries to persist data locally so that it's available even if the device is offline. Developer credentials don't need to be stored on the mobile device to access the service. You can use Amazon Cognito to obtain a normalized user ID and credentials. User data is persisted in a dataset that can store up to 1 MB of key-value pairs, and you can have up to 20 datasets per user identity. With Amazon Cognito Sync, the data stored for each identity is accessible only to credentials assigned to that identity. In order to use the Cognito Sync service, you need to make API calls using credentials retrieved with the Amazon Cognito Identity service. If you want to use Cognito Sync in an Android or iOS application, you will probably want to make API calls via the AWS Mobile SDK. To learn more, see the Developer Guide for Android and the Developer Guide for iOS.

AWS App Mesh

App Mesh is a service mesh based on the Envoy proxy that makes it easy to monitor and control microservices. App Mesh standardizes how your microservices communicate, giving you end-to-end visibility and helping to ensure high availability for your applications. App Mesh gives you consistent visibility and network traffic controls for every microservice in an application. You can use App Mesh with Amazon Web Services Fargate, Amazon ECS, Amazon EKS, Kubernetes on Amazon Web Services, and Amazon EC2. App Mesh supports microservice applications that use service discovery naming for their components. For more information about service discovery on Amazon ECS, see Service Discovery in the Amazon Elastic Container Service Developer Guide. Kubernetes kube-dns and coredns are supported. For more information, see DNS for Services and Pods in the Kubernetes documentation.

AWS Compute Optimizer

Compute Optimizer is a service that analyzes the configuration and utilization metrics of your Amazon Web Services compute resources, such as Amazon EC2 instances, Amazon EC2 Auto Scaling groups, Lambda functions, and Amazon EBS volumes. It reports whether your resources are optimal, and generates optimization recommendations to reduce the cost and improve the performance of your workloads. Compute Optimizer also provides recent utilization metric data, in addition to projected utilization metric data for the recommendations, which you can use to evaluate which recommendation provides the best price-performance trade-off. The analysis of your usage patterns can help you decide when to move or resize your running resources, and still meet your performance and capacity requirements. For more information about Compute Optimizer, including the required permissions to use the service, see the Compute Optimizer User Guide.

AWS CodeStar connections

This AWS CodeStar Connections API Reference provides descriptions and usage examples of the operations and data types for the AWS CodeStar Connections API. You can use the connections API to work with connections and installations. Connections are configurations that you use to connect AWS resources to external code repositories. Each connection is a resource that can be given to services such as CodePipeline to connect to a third-party repository such as Bitbucket. For example, you can add the connection in CodePipeline so that it triggers your pipeline when a code change is made to your third-party code repository. Each connection is named and associated with a unique ARN that is used to reference the connection. When you create a connection, the console initiates a third-party connection handshake. Installations are the apps that are used to conduct this handshake. For example, the installation for the Bitbucket provider type is the Bitbucket app. When you create a connection, you can choose an existing installation or create one. When you want to create a connection to an installed provider type such as GitHub Enterprise Server, you create a host for your connections.

You can work with connections by calling:
- CreateConnection, which creates a uniquely named connection that can be referenced by services such as CodePipeline.
- DeleteConnection, which deletes the specified connection.
- GetConnection, which returns information about the connection, including the connection status.
- ListConnections, which lists the connections associated with your account.

You can work with hosts by calling:
- CreateHost, which creates a host that represents the infrastructure where your provider is installed.
- DeleteHost, which deletes the specified host.
- GetHost, which returns information about the host, including the setup status.
- ListHosts, which lists the hosts associated with your account.

You can work with tags in AWS CodeStar Connections by calling the following:
- ListTagsForResource, which gets information about AWS tags for a specified Amazon Resource Name (ARN) in AWS CodeStar Connections.
- TagResource, which adds or updates tags for a resource in AWS CodeStar Connections.
- UntagResource, which removes tags for a resource in AWS CodeStar Connections.

For information about how to use AWS CodeStar Connections, see the Developer Tools User Guide.
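
The connection lifecycle described above can be sketched with the AWS SDK for JavaScript v3; the connection name and provider type below are placeholders, and the real handshake still has to be completed in the console.

```typescript
// Sketch: create a connection, then read back its status (names are placeholders).
import {
  CodeStarConnectionsClient,
  CreateConnectionCommand,
  GetConnectionCommand,
} from "@aws-sdk/client-codestar-connections";

const codestar = new CodeStarConnectionsClient({ region: "us-east-1" });

export async function createAndInspectConnection(): Promise<void> {
  const created = await codestar.send(
    new CreateConnectionCommand({ ProviderType: "Bitbucket", ConnectionName: "my-bitbucket-connection" })
  );
  // The connection stays in PENDING until the console handshake with the provider completes.
  const details = await codestar.send(new GetConnectionCommand({ ConnectionArn: created.ConnectionArn! }));
  console.log(details.Connection?.ConnectionStatus);
}
```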

CodeArtifact

AWS CodeArtifact is a fully managed artifact repository compatible with language-native package managers and build tools such as npm, Apache Maven, and pip. You can use CodeArtifact to share packages with development teams and pull packages. Packages can be pulled from both public and CodeArtifact repositories. You can also create an upstream relationship between a CodeArtifact repository and another repository, which effectively merges their contents from the point of view of a package manager client.

AWS CodeArtifact components: use the information in this guide to help you work with the following CodeArtifact components:
- Repository: A CodeArtifact repository contains a set of package versions, each of which maps to a set of assets, or files. Repositories are polyglot, so a single repository can contain packages of any supported type. Each repository exposes endpoints for fetching and publishing packages using tools like the npm CLI, the Maven CLI (mvn), and pip.
- Domain: Repositories are aggregated into a higher-level entity known as a domain. All package assets and metadata are stored in the domain, but are consumed through repositories. A given package asset, such as a Maven JAR file, is stored once per domain, no matter how many repositories it's present in. All of the assets and metadata in a domain are encrypted with the same customer master key (CMK) stored in AWS Key Management Service (AWS KMS). Each repository is a member of a single domain and can't be moved to a different domain. The domain allows organizational policy to be applied across multiple repositories, such as which accounts can access repositories in the domain, and which public repositories can be used as sources of packages. Although an organization can have multiple domains, we recommend a single production domain that contains all published artifacts so that teams can find and share packages across their organization.
- Package: A package is a bundle of software and the metadata required to resolve dependencies and install the software. CodeArtifact supports npm, PyPI, and Maven package formats. In CodeArtifact, a package consists of: a name (for example, webpack is the name of a popular npm package), an optional namespace (for example, @types in @types/node), a set of versions (for example, 1.0.0, 1.0.1, 1.0.2, etc.), and package-level metadata (for example, npm tags).
- Package version: A version of a package, such as @types/node 12.6.9. The version number format and semantics vary for different package formats. For example, npm package versions must conform to the Semantic Versioning specification. In CodeArtifact, a package version consists of the version identifier, metadata at the package version level, and a set of assets.
- Upstream repository: One repository is upstream of another when the package versions in it can be accessed from the repository endpoint of the downstream repository, effectively merging the contents of the two repositories from the point of view of a client. CodeArtifact allows creating an upstream relationship between two repositories.
- Asset: An individual file stored in CodeArtifact associated with a package version, such as an npm .tgz file or Maven POM and JAR files.

CodeArtifact supports these operations:
- AssociateExternalConnection: Adds an existing external connection to a repository.
- CopyPackageVersions: Copies package versions from one repository to another repository in the same domain.
- CreateDomain: Creates a domain.
- CreateRepository: Creates a CodeArtifact repository in a domain.
- DeleteDomain: Deletes a domain. You cannot delete a domain that contains repositories.
- DeleteDomainPermissionsPolicy: Deletes the resource policy that is set on a domain.
- DeletePackageVersions: Deletes versions of a package. After a package has been deleted, it can be republished, but its assets and metadata cannot be restored because they have been permanently removed from storage.
- DeleteRepository: Deletes a repository.
- DeleteRepositoryPermissionsPolicy: Deletes the resource policy that is set on a repository.
- DescribeDomain: Returns a DomainDescription object that contains information about the requested domain.
- DescribePackageVersion: Returns a PackageVersionDescription object that contains details about a package version.
- DescribeRepository: Returns a RepositoryDescription object that contains detailed information about the requested repository.
- DisposePackageVersions: Disposes versions of a package. A package version with the status Disposed cannot be restored because its assets have been permanently removed from storage.
- DisassociateExternalConnection: Removes an existing external connection from a repository.
- GetAuthorizationToken: Generates a temporary authorization token for accessing repositories in the domain. The token expires after the authorization period has passed. The default authorization period is 12 hours and can be customized to any length with a maximum of 12 hours.
- GetDomainPermissionsPolicy: Returns the policy of a resource that is attached to the specified domain.
- GetPackageVersionAsset: Returns the contents of an asset that is in a package version.
- GetPackageVersionReadme: Gets the readme file or descriptive text for a package version.
- GetRepositoryEndpoint: Returns the endpoint of a repository for a specific package format. A repository has one endpoint for each package format: npm, pypi, and maven.
- GetRepositoryPermissionsPolicy: Returns the resource policy that is set on a repository.
- ListDomains: Returns a list of DomainSummary objects. Each returned DomainSummary object contains information about a domain.
- ListPackages: Lists the packages in a repository.
- ListPackageVersionAssets: Lists the assets for a given package version.
- ListPackageVersionDependencies: Returns a list of the direct dependencies for a package version.
- ListPackageVersions: Returns a list of package versions for a specified package in a repository.
- ListRepositories: Returns a list of repositories owned by the AWS account that called this method.
- ListRepositoriesInDomain: Returns a list of the repositories in a domain.
- PutDomainPermissionsPolicy: Attaches a resource policy to a domain.
- PutRepositoryPermissionsPolicy: Sets the resource policy on a repository that specifies permissions to access it.
- UpdatePackageVersionsStatus: Updates the status of one or more versions of a package.
- UpdateRepository: Updates the properties of a repository.
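
A hedged sketch of the token and endpoint operations listed above, using the AWS SDK for JavaScript v3; the domain and repository names are placeholders.

```typescript
// Sketch: fetch a temporary auth token and the npm endpoint for a repository (names are placeholders).
import {
  CodeartifactClient,
  GetAuthorizationTokenCommand,
  GetRepositoryEndpointCommand,
} from "@aws-sdk/client-codeartifact";

const codeartifact = new CodeartifactClient({ region: "us-east-1" });

export async function getNpmRepoAccess(): Promise<void> {
  const token = await codeartifact.send(
    new GetAuthorizationTokenCommand({ domain: "my-domain", durationSeconds: 3600 })
  );
  const endpoint = await codeartifact.send(
    new GetRepositoryEndpointCommand({ domain: "my-domain", repository: "my-repo", format: "npm" })
  );
  // The token and endpoint can then be handed to the npm CLI (e.g. via .npmrc) to pull packages.
  console.log(endpoint.repositoryEndpoint, token.expiration);
}
```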

AWS Migration Hub

The AWS Migration Hub API methods help to obtain server and application migration status and integrate your resource-specific migration tool by providing a programmatic interface to Migration Hub. Remember that you must set your AWS Migration Hub home region before you call any of these APIs, or a HomeRegionNotSetException error will be returned. Also, you must make the API calls while in your home region.

AWS Batch

Using Batch, you can run batch computing workloads on the AWS Cloud. Batch computing is a common means for developers, scientists, and engineers to access large amounts of compute resources. Batch uses the advantages of this computing workload to remove the undifferentiated heavy lifting of configuring and managing required infrastructure. At the same time, it also adopts a familiar batch computing software approach. Given these advantages, Batch can help you to efficiently provision resources in response to jobs submitted, thus effectively helping you to eliminate capacity constraints, reduce compute costs, and deliver your results more quickly. As a fully managed service, Batch can run batch computing workloads of any scale. Batch automatically provisions compute resources and optimizes workload distribution based on the quantity and scale of your specific workloads. With Batch, there's no need to install or manage batch computing software. This means that you can focus your time and energy on analyzing results and solving your specific problems.

Other APIs in the same category

Security Center

azure.com
API spec for Microsoft.Security (Azure Security Center) resource provider

Amazon DevOps Guru

Amazon DevOps Guru is a fully managed service that helps you identify anomalous behavior in business critical operational applications. You specify the AWS resources that you want DevOps Guru to cover, then the Amazon CloudWatch metrics and AWS CloudTrail events related to those resources are analyzed. When anomalous behavior is detected, DevOps Guru creates an insight that includes recommendations, related events, and related metrics that can help you improve your operational applications. For more information, see What is Amazon DevOps Guru. You can specify 1 or 2 Amazon Simple Notification Service topics so you are notified every time a new insight is created. You can also enable DevOps Guru to generate an OpsItem in AWS Systems Manager for each insight to help you manage and track your work addressing insights. To learn about the DevOps Guru workflow, see How DevOps Guru works. To learn about DevOps Guru concepts, see Concepts in DevOps Guru.

Amazon Braket

The Amazon Braket API Reference provides information about the operations and structures supported in Amazon Braket.

Platform API

The REST API specification for Ably.

AWS Audit Manager

Welcome to the Audit Manager API reference. This guide is for developers who need detailed information about the Audit Manager API operations, data types, and errors. Audit Manager is a service that provides automated evidence collection so that you can continuously audit your Amazon Web Services usage, and assess the effectiveness of your controls to better manage risk and simplify compliance. Audit Manager provides pre-built frameworks that structure and automate assessments for a given compliance standard. Frameworks include a pre-built collection of controls with descriptions and testing procedures, which are grouped according to the requirements of the specified compliance standard or regulation. You can also customize frameworks and controls to support internal audits with unique requirements. Use the following links to get started with the Audit Manager API:
- Actions: An alphabetical list of all Audit Manager API operations.
- Data types: An alphabetical list of all Audit Manager data types.
- Common parameters: Parameters that all Query operations can use.
- Common errors: Client and server errors that all operations can return.

If you're new to Audit Manager, we recommend that you review the Audit Manager User Guide.

Amazon Appflow

Welcome to the Amazon AppFlow API reference. This guide is for developers who need detailed information about the Amazon AppFlow API operations, data types, and errors. Amazon AppFlow is a fully managed integration service that enables you to securely transfer data between software as a service (SaaS) applications like Salesforce, Marketo, Slack, and ServiceNow, and AWS services like Amazon S3 and Amazon Redshift. Use the following links to get started on the Amazon AppFlow API:
- Actions: An alphabetical list of all Amazon AppFlow API operations.
- Data types: An alphabetical list of all Amazon AppFlow data types.
- Common parameters: Parameters that all Query operations can use.
- Common errors: Client and server errors that all operations can return.

If you're new to Amazon AppFlow, we recommend that you review the Amazon AppFlow User Guide. Amazon AppFlow API users can use vendor-specific mechanisms for OAuth, and include applicable OAuth attributes (such as auth-code and redirecturi) with the connector-specific ConnectorProfileProperties when creating a new connector profile using Amazon AppFlow API operations. For example, Salesforce users can refer to the Authorize Apps with OAuth documentation.

Synthetics

You can use Amazon CloudWatch Synthetics to continually monitor your services. You can create and manage canaries, which are modular, lightweight scripts that monitor your endpoints and APIs from the outside-in. You can set up your canaries to run 24 hours a day, once per minute. The canaries help you check the availability and latency of your web services and troubleshoot anomalies by investigating load time data, screenshots of the UI, logs, and metrics. The canaries seamlessly integrate with CloudWatch ServiceLens to help you trace the causes of impacted nodes in your applications. For more information, see Using ServiceLens to Monitor the Health of Your Applications in the Amazon CloudWatch User Guide. Before you create and manage canaries, be aware of the security considerations. For more information, see Security Considerations for Synthetics Canaries.

AWS Server Migration Service

AWS Server Migration Service (AWS SMS) makes it easier and faster for you to migrate your on-premises workloads to AWS. To learn more about AWS SMS, see the AWS Server Migration Service product page and the AWS Server Migration Service User Guide.

AppPlatformManagementClient

azure.com
REST API for Azure Spring Cloud

Amazon Timestream Write

Amazon Timestream is a fast, scalable, fully managed time series database service that makes it easy to store and analyze trillions of time series data points per day. With Timestream, you can easily store and analyze IoT sensor data to derive insights from your IoT applications. You can analyze industrial telemetry to streamline equipment management and maintenance. You can also store and analyze log data and metrics to improve the performance and availability of your applications. Timestream is built from the ground up to effectively ingest, process, and store time series data. It organizes data to optimize query processing. It automatically scales based on the volume of data ingested and on the query volume to ensure you receive optimal performance while inserting and querying data. As your data grows over time, Timestream’s adaptive query processing engine spans across storage tiers to provide fast analysis while reducing costs.
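
To make the ingestion model concrete, here is a hedged sketch of writing a single measurement with the AWS SDK for JavaScript v3; the database, table, and dimension names are placeholders.

```typescript
// Sketch: write one time-series record (database/table/dimension names are placeholders).
import { TimestreamWriteClient, WriteRecordsCommand } from "@aws-sdk/client-timestream-write";

const timestream = new TimestreamWriteClient({ region: "us-east-1" });

export async function writeTemperatureSample(): Promise<void> {
  await timestream.send(
    new WriteRecordsCommand({
      DatabaseName: "iot-telemetry",
      TableName: "sensor-readings",
      Records: [
        {
          Dimensions: [{ Name: "device_id", Value: "sensor-42" }],
          MeasureName: "temperature",
          MeasureValue: "21.5",
          MeasureValueType: "DOUBLE",
          Time: Date.now().toString(),   // milliseconds since epoch
          TimeUnit: "MILLISECONDS",
        },
      ],
    })
  );
}
```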

AWS Marketplace Catalog Service

Catalog API actions allow you to manage your entities through list, describe, and update capabilities. An entity can be a product or an offer on AWS Marketplace. You can automate your entity update process by integrating the AWS Marketplace Catalog API with your AWS Marketplace product build or deployment pipelines. You can also create your own applications on top of the Catalog API to manage your products on AWS Marketplace.