Driven User Guide

version 1.2

Application Views

As business demand for data analytics increases, so does the demand to retrieve application execution data and telemetry information in a timely manner for statistical reporting, forecasting, resource allocation, and other purposes.

After filtering data in Driven so that the metrics, time range, and other dimensions represent the application type that you want to explore in depth, save the search and filter parameters as an Application View. Each Application View shows the current and past execution details of applications that match the view criteria. The Application Runtime graphs in these views, together with the tabular data under the graphs, provide insights into application behavior that can affect compliance with service-level agreement (SLA) requirements.

Figure 1: Application Runtime graph and slider

Main Insights from Application Views

In most cases you want to compare or observe execution instances of the same application in an Application View to check whether, and how, the app deviates from SLAs or other benchmarks. Visualizing too many applications limits the usefulness of the Application Runtime graph. Also, Application Views are suited to apps that run periodically, not to apps that run ad hoc.

The Application Runtime graph helps you inspect and compare application runs in a dynamic, visual representation of the following metrics:

  • Application execution status: Each colored line in the graph represents an instance of an application run. The color of the line reflects the status of the application run.

  • Application run duration: The slope of a colored line represents the duration of the application run. The point where a run line touches the top horizontal line marks the start date/time of the run, while the point touching the bottom horizontal line marks the end date/time. Therefore, a steeper slope in the app run line indicates a longer duration. Hover over the top or bottom of the application run line to view the timestamp of the start or end point, respectively.

  • Historical application performance: While viewing the application runs for a time period, you can overlay the graph with data from runs of the same application prior to that period. For example, if your graph is set to focus on a period of 1 week, you can click the Show Historical Runs option to display the application run information of the previous week. The historical runs appear as gray lines without changing the graph's time axis. This rendering lets you compare how an application performed at repeated time intervals in a unified view.

You can view an application's historical runs only when the Framed to Interval option under the graph is selected. See Time Ranges below for more information about time intervals.

Information about each application run is also displayed in the table under the graph. (Use the arrowheads directly above the column headings to page through a table that contains more rows than fit on the page.) Further details about application runs can be displayed in the table by using the column chooser.

Variances in the runtimes of a particular application can indicate that there are issues requiring attention. The linear graphic representation of application runs is one way that Driven facilitates this type of troubleshooting. By scanning the graph to see if runs of the same application are sloped differently relative to one another, you can spot whether further investigation into persistence and reliability is needed.
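The same variance check can be applied to run durations outside Driven, for example after exporting the tabular data described later in this guide. The following is a minimal sketch, not part of Driven itself; the function name, threshold, and sample durations are illustrative assumptions:

```python
from statistics import mean, stdev

def flag_outlier_runs(durations_sec, threshold=2.0):
    """Return the indices of runs whose duration deviates from the
    mean by more than `threshold` standard deviations.
    `durations_sec` holds one duration (in seconds) per run."""
    if len(durations_sec) < 2:
        return []
    mu = mean(durations_sec)
    sigma = stdev(durations_sec)
    if sigma == 0:
        return []
    return [i for i, d in enumerate(durations_sec)
            if abs(d - mu) > threshold * sigma]

# Nine typical runs and one unusually slow run (index 9)
runs = [300, 310, 295, 305, 298, 302, 299, 304, 297, 900]
print(flag_outlier_runs(runs))  # → [9]
```

Runs flagged this way correspond to the visibly different slopes you would scan for in the Application Runtime graph.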

You can also detect whether other anomalies are occurring when applications execute. The information about these anomalies can help uncover existing or potential problems. Table 1 lists a few types of anomalous application behavior along with some possible causes to consider. The information in the table is not exhaustive and not applicable to every situation.

Table 1. Possible Causes for Anomalous Application Behavior

  Unexpected Behavior: App started as expected, but finished later than expected
  Possible Causes:     1) Cluster overload; 2) App processed an exceptionally large dataset

  Unexpected Behavior: App started late, but finished on time
  Possible Causes:     1) Dependency on another app that finished late; 2) Cluster overload

  Unexpected Behavior: App was not executed
  Possible Causes:     1) Dependency on another app that did not complete successfully; 2) Scheduling error
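The triage logic in Table 1 can be expressed as a simple lookup when post-processing exported run data. This is an illustrative sketch only; the behavior labels and function name are assumptions, not Driven terminology or API:

```python
# Map observed run behavior to the possible causes listed in Table 1.
# The behavior keys below are illustrative labels, not Driven terms.
POSSIBLE_CAUSES = {
    "started_on_time_finished_late": [
        "Cluster overload",
        "App processed an exceptionally large dataset",
    ],
    "started_late_finished_on_time": [
        "Dependency on another app that finished late",
        "Cluster overload",
    ],
    "not_executed": [
        "Dependency on another app that did not complete successfully",
        "Scheduling error",
    ],
}

def suggest_causes(behavior):
    """Return the Table 1 causes to investigate for a behavior label."""
    return POSSIBLE_CAUSES.get(behavior, ["No known cause; investigate manually"])

print(suggest_causes("not_executed"))
```

As the table itself notes, these causes are starting points for investigation, not exhaustive diagnoses.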

Controlling Application Runtime Views

Time Ranges

There are several ways to change the time range that is displayed in the Application Runtime graph:

  • "Timeline Brush": As you hover the pointer over the graph, the pointer changes to a crosshair icon.

    • Click and drag the icon to select a time range in the graph. The brush only allows you to select a period within the graphed time range.

    • Single-click on an application run line to view the starting and ending time.

    • Double-click on an application run to exit the Application Runtime view and open App Details.

  • Slider: If you hover over the start and end dates of the shaded bar on the slider, the pointer changes to arrowheads. Click and drag to change the dates.

  • Standard Time Intervals: To view application run metrics in the graph by standard units of time, the Framed to Interval option must be selected. You can then specify the unit of time (calendar month, week, day, or hour) in the Time Interval field. Click the arrowheads next to the interval setting fields to move the graph time frame.

Changing the date/time range with the timeline brush or the slider sets the Time Interval to Custom, which disables the Show Historical Runs option. If historical data is important for your discovery purposes, consider browsing the time span with the arrowheads next to the Time Interval field when it is set to Month, Week, Day, or Hour.

Figure 2: Single-click an application execution instance to view time and duration details

Fine-Tuning Graphic Details

Click Show Guide Lines if you want the graph to display vertical lines that demarcate the start and end of whatever time unit is selected in the Guide Line Period field.

Click Show State Transitions if you want the color of an application execution line to change when the status of the application changes during the time span represented between the top and bottom horizontal axes. The legend for application status colors appears in the top right corner. By showing state transitions, you can pinpoint exactly when an application changed status. For example, if the line representing an application execution is light blue on the top part but yellow on the bottom part, you can see the date and time at which the application changed from running to stopped status.

Tabular Data

The table on an Application View page provides a detailed breakdown of application execution data at the flow level. Use the table to drill down and gain insights into application performance on your cluster. Some key monitoring capabilities of the tabular interface include the following:

  • Export application-level tabular data to a tab-separated values text file

  • Add or remove metrics that are displayed

  • Click on a hyperlinked flow name to view processing data on more granular levels, including visualization of flows and steps as directed acyclic graphs (DAGs)

The Driven page displays a maximum of 25 rows. Use the pagination arrows to navigate a table that spans more than one page.


You can arrange the columns in your preferred order by dragging a column heading to a new position. You can also sort the items within a column by clicking the bidirectional arrow.

Exporting Data to a .tsv File

As part of your analytical process, you can download the application data presented in a Driven table as a tab-separated values (.tsv) file, which you can then load into a spreadsheet to detect patterns and analyze metrics and usage.

Click the download icon to capture the Driven table data and download it to a file.
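Because the export is plain tab-separated text, it can also be parsed directly in a script instead of a spreadsheet. The sketch below uses Python's standard csv module; the sample rows and column names ("Name", "Status", "Duration") are hypothetical stand-ins for the columns your actual export contains:

```python
import csv
import io

# Hypothetical sample of a Driven .tsv export; a real file would be
# opened with open(path, newline="") instead of io.StringIO.
sample = (
    "Name\tStatus\tDuration\n"
    "nightly-etl\tSUCCESSFUL\t305\n"
    "nightly-etl\tSUCCESSFUL\t298\n"
)

# DictReader maps each row to a dict keyed by the header line.
rows = list(csv.DictReader(io.StringIO(sample), delimiter="\t"))
for row in rows:
    print(row["Name"], row["Status"], row["Duration"])
```

From here the rows can feed whatever downstream analysis you use, such as charting run durations over time.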

Track Cascading Applications by Various Metrics

Driven lets you customize most of the information that the table displays. Click the column chooser icon to reveal or conceal columnar metrics. The Status and Name columns cannot be hidden.

The columnar metrics are categorized in the column chooser. Each category can be collapsed or expanded.

A key feature of the table and column chooser is the ability to import and view counter attributes. See Counter Data and Other Metrics in Tables for more information.