Version 2.3 - January 2021

Venari 2.3 adds a powerful new proxy intercept capability that enables deep integrations with QA and security tools that generate HTTP traffic. Intercept mode provides a traffic export that can be imported into Venari HTTP collections and/or templated traffic definitions. We have also added a flexible licensing model that allows users of the DevOps edition to easily convert scan job nodes into instances of the desktop Ultimate Edition; the conversion can go in either direction. New configuration UI allows finer-grained control over job scope and limits, and there are a number of usability and readability enhancements. Version 2.3 also expands the set of vulnerability detections via new analysis engine updates and rules.

See the list below for the full set of new features and enhancements.

Highlights

Advanced Troubleshooting Features

Proxy Intercept Mode

Venari's local server can now operate as an HTTP proxy and can capture HTTP request/response streams. These captured streams can be imported as traffic collections or templated traffic definitions and can be used for scanning.

Capturing

Begin a non-targeted proxy capture by following these steps:

  • Click the Intercept tab on the left of the UI home page.
  • Click the Start button in the upper left corner. The proxy runs on a randomly selected available port.
  • Click the browser button that just appeared to the right of the settings button. This will launch a Chromium browser that is connected to the proxy on the selected port.
  • As you browse, all of the HTTP traffic from the browser is captured into a special traffic database container. Observe that the traffic UI is identical to the standard HTTP panel seen in the scan UI.
  • Hit the refresh button to see an expandable tree of origins.

The screens below show the steps to start a proxy intercept with capture.

The animation below shows a capture in progress using the connected Chromium browser.
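
In addition to the bundled Chromium browser, other QA and security tools that honor standard HTTP proxy settings can be pointed at the intercept port so that their traffic is captured as well. A minimal sketch using Python's requests library, assuming the proxy came up on local port 8080 (the actual port is shown in the intercept UI) and a hypothetical target application:

import requests

# Hypothetical values: use the port shown in the intercept UI after clicking Start.
INTERCEPT_PROXY = "http://127.0.0.1:8080"

proxies = {"http": INTERCEPT_PROXY, "https": INTERCEPT_PROXY}

# verify=False is shown only because an intercepting proxy typically re-signs TLS
# traffic with its own certificate; adjust to your environment.
response = requests.get("https://app-under-test.example.com/", proxies=proxies, verify=False)
print(response.status_code)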

Curated Collection via Simple Rules

By default, the proxy intercepts all traffic and saves all requests and responses to the captured data. You can use rules to apply whitelist and blacklist filtering for fine-grained control over what is collected in the database. The rules can be very simple or can be complex aggregates if the need arises. In addition to filter rules, there are settings for explicit proxy port and optional client certificates.

Simple Rule (Drop any host name containing 'google')

Compound Rule (Exclude Search Engines)
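
As a conceptual sketch of what these rules express (this is not Venari's rule syntax, and the host names are hypothetical), the simple and compound rules above behave roughly like this:

# Conceptual sketch of blacklist filtering on the host name; not Venari's rule syntax.
BLOCKED_SUBSTRINGS = ["google", "bing", "duckduckgo"]  # a compound rule might exclude several search engines

def should_capture(host: str) -> bool:
    """Return True when a request to this host should be saved to the capture database."""
    host = host.lower()
    return not any(blocked in host for blocked in BLOCKED_SUBSTRINGS)

assert should_capture("app-under-test.example.com")
assert not should_capture("www.google.com")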

Exporting

Captured proxy traffic can be exported in several formats as shown in the animation below. The Traffic XML format can be imported into a traffic collection or a templated traffic definition. More on that feature can be found in the New Traffic Source Imports section below.


Flex Licensing

Venari DevOps edition users can convert job node licenses into Ultimate Edition licenses by specifying the master node URL and claiming an Ultimate Edition license. These flex license conversions work in the other direction as well, so the Ultimate Edition license can be returned to the job node pool.

The conversions to/from Ultimate Edition are transient, and the total scanner count is fixed. This simple mechanism allows flexible deployments of 'scan nodes' into scan farms under the control of an orchestrator or into desktop versions.


Interactive Login with Multi-factor Authentication

Venari has a new login type for handling multi-factor authentication flows. Interactive login mode launches a browser and allows the user to manually navigate and input all authentication data, including MFA tokens if required. The final state of the browser is collected into the Venari state manager and the scan continues automatically from that point.
For scans that cannot be fully automated due to MFA challenges, interactive login allows for scans to be semi-automated by alerting the UI user when/if subsequent logins are required.

Interactive login is configurable on the home screen when new applications are created and can be set in specific job templates in the start tab.


Custom Error Page (NOT FOUND) Configuration

Many of Venari's analysis algorithms depend on accurately determining whether a requested resource (page) is 'found.' If the application under test does not use true 404 status codes to indicate the NOT FOUND condition, then the analysis engines must use alternative detection schemes. There are built-in comparison techniques to detect custom error pages or redirect behavior that indicates NOT FOUND. Version 2.3 introduces UI configuration to specify custom signatures that detect the NOT FOUND condition.

Custom Signatures

There are built-in signatures for common patterns, and you can create custom signatures as well. The example below shows a signature that can be efficiently detected with a simple rule.

Note that the error text is inside an H1 element's inner text.
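
As a rough illustration of the kind of check such a signature performs (the real configuration is done in the UI, and the page content below is hypothetical), a rule keyed to the H1 element's inner text could be expressed like this:

import re

# Hypothetical custom error page returned with a 200 status instead of a true 404.
CUSTOM_ERROR_PAGE = """
<html><body>
  <h1>Sorry, we could not find that page</h1>
</body></html>
"""

def looks_not_found(html: str) -> bool:
    """Treat a response as NOT FOUND when the H1 inner text matches the custom signature."""
    match = re.search(r"<h1[^>]*>(.*?)</h1>", html, re.IGNORECASE | re.DOTALL)
    return bool(match) and "could not find" in match.group(1).lower()

assert looks_not_found(CUSTOM_ERROR_PAGE)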


Multiple Window Support

Version 2.3 features a button that creates new instances of the Venari UI. The UI is an Electron application that pulls data from and sends commands to a locally running job server (node). The local server node exposes all Venari capabilities through REST API micro-services. This feature demonstrates the core Venari architecture, which is designed for DevOps and desktop use cases.

The screenshot below shows 2 instances side by side. The left screen is the summary view, and the right screen shows the user drilling into the findings view to inspect the XSS vulnerability evidence more closely. This is a common triage use case, and it is now easier with multi-window support.


Client Certificate Support

Client certificate configuration has been added to both the internal HTTP requestor and the pool of headless browsers. This configuration is available in the HTTP tab of the template editor UI.

The screenshot below shows the certificate configuration UI panel. The host pattern is an optional wildcard pattern for choosing which hosts to associate with the specified client certificate. If not specified, the * pattern is the default.
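
As an illustration of how a wildcard host pattern selects which hosts use a given client certificate (a sketch of the matching concept only; the patterns and file paths are hypothetical):

from fnmatch import fnmatch

# Hypothetical mapping of host patterns to client certificate files.
CERT_BY_PATTERN = {
    "*.internal.example.com": "/certs/internal-client.pfx",
    "*": "/certs/default-client.pfx",  # the default pattern when none is specified
}

def certificate_for(host: str) -> str:
    """Return the first certificate whose host pattern matches; '*' acts as the catch-all."""
    for pattern, cert_path in CERT_BY_PATTERN.items():
        if fnmatch(host.lower(), pattern):
            return cert_path
    raise LookupError(host)

print(certificate_for("api.internal.example.com"))  # /certs/internal-client.pfx
print(certificate_for("www.example.org"))           # /certs/default-client.pfx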


HTTPS Proxy Support

Scan jobs can now connect to HTTPS proxies. The animation below shows the simple configuration.


New Data Exports

There are four new export types available for extracting scan job data to external files. The new exports are:

  • URLs
  • Parameters
  • Traffic
  • Jobs

These exports are available from multiple locations in the Venari UI. All views that show traffic have an export button and you can also choose an export type from the jobs summary view. The animation below shows the job data export UI.


Automatic Export of Job Data on Scan Completion

DevOps edition users can now configure a customizable set of exports that happen when the scan completes. The results can be emailed to specific users as attachments or as links.


New Traffic Source Imports

Venari's universal traffic framework now supports direct XML export of HTTP traffic from internal sources such as scan jobs and proxy captures. This traffic format can also be imported into HTTP collections and/or templated definitions, where the traffic is annotated with {variable} syntax. The combination of targeted proxy capture, traffic export, and templated traffic import allows users to create custom business logic workflows for automated fuzz testing. The traffic import panel is a powerful tool for curating traffic into reusable workflows and testing individual items as you go.
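
As a rough sketch of what templating enables (the exact import format is handled by the UI, and the request body and variable name below are hypothetical), a captured request annotated with a {variable} can be replayed with different payloads during fuzzing:

# Hypothetical captured request body after templating one parameter value with {amount}.
TEMPLATED_BODY = "account=12345&amount={amount}&currency=USD"

def render(template: str, **variables: str) -> str:
    """Substitute {variable} placeholders with concrete payload values."""
    body = template
    for name, value in variables.items():
        body = body.replace("{" + name + "}", value)
    return body

# A fuzzer can then vary only the templated parameter while replaying the workflow.
for payload in ["100", "-1", "999999999999"]:
    print(render(TEMPLATED_BODY, amount=payload))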

The animations below show the following steps, which demonstrate the onboarding of a workflow that targets specific, ordered steps, like those required for business logic testing.

  • Export of captured traffic
  • Templated import of that same traffic
  • Payload Customization with a new variable for a parameter value
  • Creating a custom workflow that includes the customized HTTP request
  • Using the new workflow in a scan job template

Once these steps have been performed, the template can replay the traffic that exercises the targeted business logic in all subsequent scans.

Exporting a Proxy Capture

Importing Traffic into a Templated Definition

Customizing a Payload

Curating a Custom Workflow

Using the workflow in a job template


Findings Comparison View

Findings comparison allows users to visualize the overlap and differences between the findings of two scans. The comparison UI can show the union, the intersection, or the findings exclusive to job A or to job B.

The process for viewing a difference set is:

  1. Export scan job A findings in JSON format from the findings view
  2. Export scan job B findings in JSON format from the findings view
  3. Click the compare icon on the left-hand tab panel
  4. Click the Clear button
  5. Click the "Import A" button and import job A findings
  6. Click the "Import B" button and import job B findings
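
Conceptually, the comparison views correspond to simple set operations over the two exports. A sketch of that idea, assuming each finding can be keyed by its name, URL, and parameter (the file names are hypothetical and the actual JSON schema may differ):

import json

def finding_keys(path: str) -> set:
    """Load a findings JSON export and key each finding by (name, url, parameter)."""
    with open(path) as f:
        findings = json.load(f)
    return {(item.get("name"), item.get("url"), item.get("parameter")) for item in findings}

a = finding_keys("job_a_findings.json")
b = finding_keys("job_b_findings.json")

union        = a | b   # everything seen in either scan
intersection = a & b   # findings common to both scans
only_a       = a - b   # findings exclusive to job A
only_b       = b - a   # findings exclusive to job B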

The screenshot below shows the intersection view of the comparison UI.


New Limit/Scope Controls

Version 2.3 introduces a powerful new fuzz limiting feature to prevent duplicate parameter fuzzing. This is a common use case for applications that have large URL sets where only the parameter value varies. The detection of the parameter context is automatic, and the scoping of the limit is configurable.

Example: Large number of product-specific URLs for an online store application

In the example above, the fuzzing engine can be tuned to limit the tests on the id parameter to only one. This avoids wasting scan time on a case where the application logic is clearly the same for all variations of the value.
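
Conceptually, the limit treats URLs that differ only in a parameter's value as a single fuzz target. A sketch of that grouping, using hypothetical store URLs:

from urllib.parse import urlparse, parse_qs

# Hypothetical product URLs that differ only in the value of the id parameter.
urls = [
    "https://store.example.com/product?id=101",
    "https://store.example.com/product?id=102",
    "https://store.example.com/product?id=103",
]

seen_signatures = set()
fuzz_targets = []
for url in urls:
    parts = urlparse(url)
    # The signature ignores parameter values, so all three URLs collapse into one target.
    signature = (parts.netloc, parts.path, tuple(sorted(parse_qs(parts.query))))
    if signature not in seen_signatures:
        seen_signatures.add(signature)
        fuzz_targets.append(url)

print(fuzz_targets)  # only the first URL remains to be fuzzed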

The screenshot below shows the UI in the Limits tab that selects the duplicate parameter limit mode.


Stored XSS Detection Improvements

The stored XSS fuzzing algorithm has been enhanced to substantially reduce false negatives.


Browser Discovery Adaptive Form Interaction

Browser discovery has been improved to detect when form inputs appear or disappear during interaction with the form's DOM.


Web View of Job Status and Alerts

DevOps users can now browse to the root URL of the orchestrator node and see a summary view of jobs by status. In order for this feature to work, operators must add entries to the environment section of their docker compose file (or make equivalent changes for alternate methods of setting environment variables). The value "${MASTER_URL}" should be replaced with the FQDN (https) of the orchestrator node.

idp__redirecturis__0: "${MASTER_URL}/index.html"
idp__redirecturis__1: "${MASTER_URL}/silent-refresh.html"
idp__redirecturis__2: "venari://idpCallback"


Enterprise User Management Improvements

DevOps edition users can now grant admin privileges to existing users in the Venari UI. Additionally, access control to applications can now be applied to job nodes and DevOps clients.


Extensions to the Workflow Language

The workflow language has new actions to explicitly wait for DOM states. The new actions are:

  • WaitWhileAny
  • WaitForAll

The complete Workflow Language Reference contains the details.


DOM Tree Auto-Formatting

The document view in the browser snapshot UI now shows a formatted version of the HTML that represents the current DOM snapshot. Indentation and de-minification help in triaging findings. Inline frame expansion shows the full DOM context in a single view.


Job Reset

If a job is paused or failed, it can now be reset to run from a clean state. Settings can be modified on reset by selecting a different or modified job template.


New Security Rules

New security rules are added with every release. In version 2.3 the following rules have been added:

  • Host Header Poisoning
  • Missing Sub-resource Integrity Attribute
  • Failed Sub-resource Integrity Check
  • Invalid Sub-resource Integrity Markup

Debug Log Filtering

The log panel exposes fine-grained control over logging behavior. In version 2.3 there is a new feature to limit which log messages are collected. This is useful in troubleshooting scenarios and technical support calls. The debug filter rules allow the user to specify a matching pattern that causes only messages that match to be saved. See the screenshot below for an example where only logs with the word 'error' get saved.
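
As an illustration of the kind of filtering this enables (a conceptual sketch using Python's standard logging module, not Venari's implementation), keeping only messages that contain the word 'error' looks like this:

import logging

class KeywordFilter(logging.Filter):
    """Keep only log records whose message contains the configured keyword."""
    def __init__(self, keyword: str):
        super().__init__()
        self.keyword = keyword.lower()

    def filter(self, record: logging.LogRecord) -> bool:
        return self.keyword in record.getMessage().lower()

logger = logging.getLogger("scan")
handler = logging.StreamHandler()
handler.addFilter(KeywordFilter("error"))  # equivalent in spirit to a debug filter rule
logger.addHandler(handler)

logger.warning("Connection error while probing /login")  # kept
logger.warning("Discovery completed for /index.html")    # dropped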


Editing Jobs

Jobs in the completed, paused, ready, failed or cancelled states can be edited. The name of the job can be changed and different settings can be applied by selecting an alternate template.


Re-scan Jobs

Completed jobs can be re-scanned. Optionally selecting the copy checkbox makes a clone of the original scan so that the first scan is not modified. There are two main use cases for a re-scan:

  • A scan was force-completed by the user, and they now want to let it finish the skipped analysis items.
  • There are completed analysis items that need to be set back to the ready state so they can run again. See the next section on queue editing for examples.


Editing Queue Items

Individual analysis queues can now be edited for specific items or sets of items. The details tab has a searchable grid with optional filtering, and when the scan is paused, the visible items can be edited from their current state to the desired target state. For example, a completed item could be changed to the ready state so that the analysis will be repeated.

In addition to changing the current state of items, their priority can also be modified. When the target state is set to 'deferred' an additional text input appears to enter the ordered group name. This is an advanced technique to identify a group of analysis items that must be executed strictly sequentially, i.e., no two of them can execute on concurrent threads.

The queue item editing feature is advanced and generally used during tech support engagements and while troubleshooting custom scan configurations.

The screenshot below shows completed browser discovery items, filtered by the keyword 'DOM', being bulk-modified to the ready state.


Queue Name Limiting

Queue name field limits can now be applied in the limits tab of a job template. The name column of the job details can be used in a limit rule so that items matching the name are either limited by a max count, blacklisted or whitelisted.

The first screenshot below shows the job details tab filtered to show only URLs that include index.html. Imagine a case where you would like to limit the scanner's content parser to ONLY analyze index.html URLs. To accomplish this, you would create a rule in the Limits tab. See the second screenshot for the rule.


Queue Priority and Isolation

The modules tab now allows users to configure the priority and isolation of items in specific analysis queues. The rule-based pattern matching is applied to the name field that is visible in the details tab of running jobs. Queue items that match the rules can have their priorities set to a specific level. The isolation rules allow items that match the rule to be grouped into a special named group. Items in that group MUST be analyzed sequentially, i.e., there is no concurrency allowed for items in that group.
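
Conceptually, items in a named isolation group are serialized while everything else runs concurrently. A sketch of that scheduling idea (hypothetical item names; not the scanner's internal implementation):

import threading
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

# One lock per isolation group; items sharing a group can never overlap in time.
group_locks = defaultdict(threading.Lock)

def analyze(name):
    print(f"analyzing {name}")

def run_item(name, group=None):
    if group is None:
        analyze(name)                 # ungrouped items run with full concurrency
    else:
        with group_locks[group]:      # grouped items run strictly one at a time
            analyze(name)

with ThreadPoolExecutor(max_workers=8) as pool:
    pool.submit(run_item, "GET /profile/1", "profile")
    pool.submit(run_item, "GET /profile/2", "profile")  # waits for the first profile item
    pool.submit(run_item, "GET /about")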

The example in the screenshot below shows the force browser module elevating the priority of any item with 'admin' in the name, and shows concurrency isolation for any items in the prober queue that contain 'profile' in their names.