What's included?
Dashboard (1): Databricks: Spark - Overview v2
Documentation (1)
Alerts (0)
Databricks is an orchestration platform for Apache Spark. Instantly monitor Databricks Spark applications with our New Relic Spark integration quickstart. Our integration provides a script, run in a notebook, that generates an installation script; attach the installation script to a cluster to populate Spark metrics as New Relic Insights events. Easily track the health of your Databricks clusters, fine-tune your Spark jobs for peak performance, and troubleshoot problems with this quickstart.
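For context, here is a minimal sketch of how a notebook cell might forward a single Spark metric to New Relic's Event API as a custom Insights event. This is not the quickstart's actual installer; the account ID, insert key, and "SparkJobSample" event type are placeholders.

# Minimal sketch (assumptions noted): post one custom Spark event to the
# New Relic Event API from a Databricks notebook cell.
# NR_ACCOUNT_ID, NR_INSERT_KEY, and "SparkJobSample" are placeholders,
# not names defined by the quickstart.
import json
import urllib.request

NR_ACCOUNT_ID = "1234567"          # placeholder account ID
NR_INSERT_KEY = "YOUR_INSERT_KEY"  # placeholder license/insert key

event = {
    "eventType": "SparkJobSample",  # hypothetical custom event type
    "clusterName": "my-databricks-cluster",
    "jobId": 42,
    "status": "SUCCEEDED",
}

req = urllib.request.Request(
    url=f"https://insights-collector.newrelic.com/v1/accounts/{NR_ACCOUNT_ID}/events",
    data=json.dumps([event]).encode("utf-8"),
    headers={"Content-Type": "application/json", "Api-Key": NR_INSERT_KEY},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    # The Event API responds with a small JSON acknowledgment on success.
    print(resp.status, resp.read().decode())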
A Databricks cluster's driver node runs each job in scheduled stages. Individual stages are broken down into tasks and distributed across executor nodes. Our New Relic Spark integration collects detailed job and stage metrics so you can get granular insight into job performance at a glance. For example, break down the job metric by status (successful, pending, or failed) to see in real time whether a high number of jobs are failing, which could indicate a code error or a memory issue at the executor level. Real-time job counts can also help you make provisioning decisions for future clusters.
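As an illustration of that status breakdown, the sketch below tallies Spark jobs by status using Spark's monitoring REST API on the driver. The driver address is an assumption (Spark's UI/API listens on port 4040 by default); a real Databricks cluster may expose it differently, and the quickstart's own collection script may gather these values another way.

# Sketch: count Spark jobs by status via the driver's monitoring REST API.
# DRIVER_API is a placeholder; adjust the host/port for your cluster.
import collections
import json
import urllib.request

DRIVER_API = "http://localhost:4040/api/v1"  # placeholder driver address

def get_json(path):
    with urllib.request.urlopen(f"{DRIVER_API}{path}") as resp:
        return json.load(resp)

status_counts = collections.Counter()
for app in get_json("/applications"):
    for job in get_json(f"/applications/{app['id']}/jobs"):
        # Spark reports job status as RUNNING, SUCCEEDED, FAILED, or UNKNOWN.
        status_counts[job["status"]] += 1

print(dict(status_counts))  # e.g. {'SUCCEEDED': 120, 'FAILED': 3}

In the dashboard, the equivalent view is typically produced by faceting the collected job events on their status attribute.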
How to use this quickstart
Authors
New Relic Labs
Support
Need help? Visit our community forum, the Explorers Hub, to find an answer or post a question.