azureml-contrib-pipeline-steps

Azure Machine Learning Parallel Run Step - Deprecated

  • 1.59.0
  • PyPI

Note

This package has been deprecated and moved to azureml-pipeline-steps. Please refer to the azureml-pipeline-steps documentation for more information.

Azure Machine Learning Batch Inference

Azure Machine Learning Batch Inference targets large inference jobs that are not time-sensitive. It provides cost-effective, scalable inference compute and is optimized for high-throughput, fire-and-forget inference over large collections of data.
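
A batch inference job runs a user-provided entry script against mini-batches of the input data in parallel across a compute cluster. As a rough sketch of that contract (the file name batch_score.py, the registered model name "batch-model", and the CSV-based scoring logic are illustrative assumptions, not part of this package's documentation), the entry script exposes an init() hook that is called once per worker process and a run(mini_batch) function that returns one result per input item:

```python
# batch_score.py -- illustrative entry script for a parallel batch-scoring step
# (a sketch under assumed names, not the official Azure ML sample).
import joblib                      # assumption: the model was serialized with joblib
import pandas as pd
from azureml.core.model import Model

model = None

def init():
    """Called once per worker process; load the model into memory."""
    global model
    # "batch-model" is a hypothetical registered model name in the workspace.
    model_path = Model.get_model_path("batch-model")
    model = joblib.load(model_path)

def run(mini_batch):
    """Called once per mini-batch; return one result per input item."""
    results = []
    for file_path in mini_batch:
        # For a FileDataset input, each item is a path to one input file.
        features = pd.read_csv(file_path)        # assumption: each file is a CSV of feature rows
        predictions = model.predict(features)
        results.extend(f"{file_path},{p}" for p in predictions)
    # With an "append_row" output action, these rows are concatenated into a single output file.
    return results
```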

Getting Started with Batch Inference Public Preview

The Batch Inference public preview provides a platform for running large-scale inference jobs or generic parallel map-style operations. Please visit Azure Machine Learning Notebooks to find tutorials on how to use this service.
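
For orientation only, the sketch below shows how such a parallel scoring step is typically wired into a pipeline using the successor package azureml-pipeline-steps (ParallelRunConfig and ParallelRunStep come from that package; the workspace configuration, dataset name "scoring-files", compute cluster name "cpu-cluster", and environment name are assumptions):

```python
from azureml.core import Workspace, Experiment, Environment, Dataset
from azureml.core.compute import ComputeTarget
from azureml.data import OutputFileDatasetConfig
from azureml.pipeline.core import Pipeline
from azureml.pipeline.steps import ParallelRunConfig, ParallelRunStep

ws = Workspace.from_config()                              # assumes a local config.json
input_ds = Dataset.get_by_name(ws, "scoring-files")       # hypothetical registered FileDataset
compute_target = ComputeTarget(workspace=ws, name="cpu-cluster")  # hypothetical AmlCompute cluster
output = OutputFileDatasetConfig(name="inference_results")

parallel_run_config = ParallelRunConfig(
    source_directory="scripts",            # folder containing the entry script
    entry_script="batch_score.py",
    mini_batch_size="10",                  # files handed to each run() call
    error_threshold=10,                    # tolerated item failures before the job aborts
    output_action="append_row",            # concatenate run() outputs into one file
    environment=Environment.get(ws, "AzureML-sklearn-1.0-ubuntu20.04-py38-cpu"),  # assumed curated env
    compute_target=compute_target,
    process_count_per_node=2,
    node_count=2,
)

batch_step = ParallelRunStep(
    name="batch-inference",
    parallel_run_config=parallel_run_config,
    inputs=[input_ds.as_named_input("input_files")],
    output=output,
    allow_reuse=True,
)

pipeline = Pipeline(workspace=ws, steps=[batch_step])
run = Experiment(ws, "batch-inference-demo").submit(pipeline)
run.wait_for_completion(show_output=True)
```

The step fans the input files out across node_count * process_count_per_node worker processes, each of which calls run() on successive mini-batches; the tutorials linked above cover the supported options in detail.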

FAQs

