The 6.x versions of Metric Insights represent a new era of modern container-based deployment (allowing for horizontal scaling of specific services), support for significantly larger Datasets via Microsoft SQL Server (the metadata/application DB will remain in MySQL/MariaDB), and the introduction of the Data Processor and Remote Data Processor services (deployed as containers).
6.1.x introduces improved column change management for Datasets, and several new features to assist with multi-tenant enterprise-scale deployments (new privileges, new options for content migration). We're excited to welcome you to this new era.
6.1.1 marks the official end-of-life for creating new Legacy Reports: existing Reports will continue to function, but new Legacy Reports can no longer be created. This is the next step in the eventual full retirement of Legacy Reports.
6.1.2 moves Global Search into Beta; it also brings a suite of new features for troubleshooting Notifications Schedules and Bursts, expands options for Alerting, extends our API in significant ways, adds workflow improvements to Dataset Reports, and introduces support for MicroStrategy Dossiers.
6.1.3 includes many enhancements for administrators, specifically around Bursting and overall system monitoring. Power Users may find our new Burst tracking useful, and content creators might notice the absence of our SQL Query Builder—we considered it a legacy component and removed it entirely, instead displaying available table names to the right of the fetch command.
- Docker-based deployment (originally introduced in 6.0)
- Container orchestration (including templates for Kubernetes and Amazon ECS)
- Rearchitected Data Processor (originally introduced in 6.0) and Remote Data Processor (replacing the Remote Data Collector)
- High-volume data loading via MS SQL Server Agent
- Smarter Dataset column management
- Easier scripted content migration
- New Power User Privileges and Permissions for JS and Email Templates
- [6.1.1] Dataset Reports: Formula-based Variables in Text Blocks
- [6.1.1] Dataset Reports: Line/bar charts can now be driven from Snapshot Date
- [6.1.1] Portal Page Asset Folders have been introduced for improved Asset management
- [6.1.1] Improved Custom PowerPoint Templates
- [6.1.1] Power Users can now be granted a privilege to create/edit Managed Alert Workflows
- [6.1.1] Alpha feature: Global Search functionality can be turned on for early testing and feedback
- [6.1.1] Monitoring Tool for system admins to keep track of all running Metric Insights services
- [6.1.2] Beta feature: Global search has been improved further in preparation for GA in 6.2.0
- [6.1.2] Dataset Reports: Easily hide and unhide columns
- [6.1.2] Dataset Reports: Custom messaging for tables that contain no rows
- [6.1.2] Metrics: Support for higher volumes of data points
- [6.1.2] Alerting Enhancements
- [6.1.2] Notification Schedule & Burst Troubleshooting
- [6.1.2] Plugin Enhancements
- [6.1.2] API Enhancements
- [6.1.3] Burst, Notification Schedule and Trigger Troubleshooting
- [6.1.3] Burst Link Tracking
- [6.1.3] Status Monitor Updates
- [6.1.3] Security Model Changes
- [6.1.3] A dozen other changes and improvements arrived in 6.1.3...
The 6.x architecture is built from the ground up to be deployed in Docker with container orchestration. 6.1.x ships with configurations for Kubernetes and ECS.
For help configuring your infrastructure-as-code tooling (e.g. Terraform, CloudFormation), contact [email protected]
Network File Systems (NFS) are now a requirement for permanent storage in the 6.x architecture. In 6.1.0, we ship with configurations for the Kubernetes-focused Portworx storage and data management tool as an additional option.
Larger Datasets with volumes in the tens of millions of rows (depending on the number of columns and the performance of your database instance/cluster) are supported via Microsoft SQL Server as of 6.0—in 6.1.0, we've introduced two methods of loading CSV data, either via a Shared Mount on the server, or an agent, for performant data loading into the SQL Server database.
There are two data upload options:
1. Create a local folder on the Microsoft SQL Server host where the CSV file is stored, and bind it to the “/opt/dp” folder inside the Data Processor's Docker container.
2. Create a Shared Folder that can be accessed by the Metric Insights Storage Engine.
The agent must be deployed in the Microsoft SQL Server environment in order to load large volumes of data. This means you will need SQL Server on a virtual or a bare-metal machine where the agent can be deployed.
Please contact [email protected] if you plan to load high volumes of data into Microsoft SQL Server and we can assist in the process.
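The shared-folder workflow above amounts to staging CSV files in a directory that is visible to the Data Processor (via the bind mount) or the Storage Engine (via the Shared Folder). A minimal sketch, with the caveat that the real mount path and file-naming conventions depend on your deployment—here a temporary directory stands in for the bound folder so the example runs anywhere:

```python
import csv
import tempfile
from pathlib import Path

# Hypothetical: in production this would be the folder bound to the
# Data Processor container's /opt/dp mount (option 1) or the Shared
# Folder visible to the Storage Engine (option 2). A temp dir is used
# here so the sketch is runnable anywhere.
shared_mount = Path(tempfile.mkdtemp())

def stage_csv(rows, filename, mount=shared_mount):
    """Write rows to a CSV file in the shared/bound folder so the
    Data Processor can bulk-load it into SQL Server."""
    target = mount / filename
    with target.open("w", newline="") as f:
        csv.writer(f).writerows(rows)
    return target

path = stage_csv([["id", "amount"], [1, 42.5], [2, 17.0]], "dataset_123.csv")
print(path.name)
```

Once the file lands in the bound folder, the Data Processor can load it without streaming the data over the network, which is where the performance gain comes from.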
If a column is not being used in a Dataset, it can now be removed without causing any downstream issues.
If a column is being used in a meaningful way in any downstream object, you will be explicitly warned about the impacted downstream objects when, for example, editing a Dataset View.
Building on Scripted Content Migration introduced in 5.6.1, the 6.1.0 release introduces the ability for content creators to select the content they would like to migrate on their own.
Ensure that the config variable DISPLAY_MIGRATION_OPTIONS is set to Y (the default is N)
- In the Element (or Category) Editor, select the "Include External Report..." option
- Wait until the next Scheduled Migration runs (this will depend on the script's schedule)
Release 6.1.1 introduces support for Portal Page migration via the same method.
For further details, explore Scripted Migration via Category and Element Editors
For assistance, contact [email protected]
In support of multi-tenant deployments that require Admin-like Power Users, we've rounded out some of the final remaining areas wherein Power Users could not previously administer the system.
In 6.1.0, we now provide Edit Access Privileges to Power Users and allow them to grant Permissions for JS and Email Templates.
Users can now build formulas that are calculated based on the column data in a Dataset Report, and display these as Variables that can be included in the Text Block component.
For details, see Adding Variables and Formulas.
When using a Snapshot Dataset as the source for a Dataset Report, you may now select Snapshot Date as the value source for the X-axis in the Line/Bar/Area chart component.
We have introduced Asset Folders to ease the management of the Assets utilized in the construction of Portal Pages. These can be shared explicitly with Users or Groups intended to manage a particular Portal Page. To review associated Security changes, see Portal Page Security
Files can be uploaded individually or as ZIP archives.
Release 6.1.2 will introduce nested Asset Folders that support multiple levels of hierarchy (e.g., Parent Folder, Child Folder, Grandchild Folder, and so on).
We have completely rearchitected our support for custom PowerPoint templates. The functionality has been simplified to a common set of variables that can be included in a POTX. Once set up in PowerPoint, the template is uploaded to Metric Insights. The template can then be selected when setting up a Burst or Favorites Folder, and the PPTX that is subsequently generated will honor the custom template.
Global Search allows for search across the majority of Metric Insights content, including Elements, Datasets, Bursts, as well as the content of extracts from external systems (via specified Datasets). We will be adding support for Portal Pages in a future minor release. We are also considering expanding this to Admin-focused objects like Notification Schedules, Data Collection Triggers, and the many other building blocks and objects that exist within the platform.
- You can turn on search via the ENABLE_GLOBAL_SEARCH Config Variable
- Search will only be enabled for Power Users and Admins
- The functionality requires MySQL v5.6+
While still an Alpha (early preview with limited testing), the functionality provides a great example of what's to come for all Metric Insights users in Release 6.2.
Please provide any feedback you have about search to [email protected]
The new service-based architecture of Metric Insights, wherein multiple services run in parallel (each providing a specific set of functionality), requires a new approach to monitoring. We've introduced a tool for monitoring the health of all services necessary for operating the platform.
A hostname and credentials for accessing the Monitoring Tool are defined when setting up Metric Insights.
Hiding and unhiding all or some columns in a table is now as easy as selecting checkboxes.
There are many scenarios in which an empty report is just as valuable as a report with data, because you need to know that there are no exceptions for that day. We've extended our "no rows" functionality to allow for a custom message, for example: "There are no exceptions for today."
Data points on a Metric are now lazy loaded, supporting far higher volumes of points overall. This is helpful for hourly Metrics with a lot of history.
- An "Alert Window" override exists on Metrics, instead of just Measurement Intervals
- You may now choose to "Always match payload measurement date to alert"
- The Notification Schedule owner can choose to receive an error report via email if the Notification Schedule run is automatically aborted/timed out
- Error reports can now be sent to multiple email addresses (separated by commas)
- It is now possible to abort a Notification Schedule run
- MicroStrategy Dossiers are now supported and are the preferred Filter/Prompt in MicroStrategy
- Qlik Sense support for formula-based filters
- Several new endpoints for Bursts:
- Create Burst
  - Assign a Notification Schedule to a Burst
- Add content to a Burst
- Assign ownership (for example, set current user as Burst owner)
- New endpoint for getting a list of all Plugin Connection Profiles
- New endpoint for getting a list of all available Notification Schedules
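The new Burst endpoints can be driven programmatically. A minimal sketch of how a client might compose those calls—note that the base URL, endpoint paths, and field names below are assumptions for illustration, not the documented contract; consult the Metric Insights API reference for the real shapes:

```python
import json

# Hypothetical base URL; the paths and payload fields below are
# illustrative assumptions, not the documented API contract.
BASE = "https://mi.example.com/api"

def create_burst_request(name, owner_user_id):
    """Build a request for the 'Create Burst' endpoint."""
    return {
        "method": "POST",
        "url": f"{BASE}/burst",
        "body": {"name": name, "owner": owner_user_id},
    }

def assign_schedule_request(burst_id, schedule_id):
    """Build a request assigning a Notification Schedule to a Burst."""
    return {
        "method": "PUT",
        "url": f"{BASE}/burst/{burst_id}/notification_schedule",
        "body": {"notification_schedule_id": schedule_id},
    }

req = create_burst_request("Daily KPI Digest", owner_user_id=42)
print(json.dumps(req["body"]))
```

Together with the new endpoints for listing Plugin Connection Profiles and Notification Schedules, this makes it possible to script Burst creation end to end: list available schedules, create the Burst, attach a schedule, then add content and set ownership.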
Bursts and Notification Schedules now have more robust logging and greater detail exposed about each Burst (its contents) and each Notification Schedule run.
Burst creators now have the option to track links clicked within a Burst. These new capabilities are available on the Customize tab in the Burst Editor.
The Status Monitor has been updated to include a new overview design that includes Mounted Volumes and a new Nodes tab for monitoring the individual nodes that together make up a complete deployment. Learn more about the Status Monitor.
- We now only allow Power Users to share Folders on the Homepage with Groups of which the Power User is a Member or Owner, and with Users who are members of these Groups
- New Privilege added: Allow Power User to subscribe or grant Burst Access to any User or Group
- Alert Simulation no longer applies the Alert window to the simulation—this should remove any confusion when setting up an Alert.
- Data Storage changes can be tracked for auditing purposes. When Admins and Power Users with Edit Access use the Data Storage editor (for editing Data Storage integrations like Microsoft SQL Server), all meaningful changes are tracked.
- Dataset Validation can now be limited to a maximum number of rows during the data load process via the DATASET_VALIDATION_MAX_ROWS Config Variable.
- Content Discoverability functionality introduced in 5.6.1 can now be set as the default via the NEW_CONTENT_IS_DISCOVERABLE Config Variable.
- Access Request emails are now prettier (these are received by admins when an end-user requests access to content)
- [6.1.1] Data Storage functionality now includes automated Health Checks to ensure that a Storage Engine always has capacity available
- [6.1.1] Jira Plugin's Visual Editor now supports "In List"
- [6.1.1] Empty Dataset Reports can be included in distributions
- [6.1.1] Legacy Reports can no longer be created
- [6.1.1] Portal Pages now honor security restrictions on User Mapped content
- [6.1.1] Ability to omit/hide Filters from External Reports
- [6.1.1] External Report Types now offer the ability to suppress the Notifications icon on the External Report Viewer (across every Report with that Type)
- [6.1.1] Element Tile Previews now display Expert Analysis in the summary information
- [6.1.1] Data Collection Trigger runs now count Datasets and User Maps (previously, they were excluded from the metadata, although they did run with the Trigger)
- [6.1.2] Datasets that have been disabled because of broken dependencies will now auto-enable (but only in situations where they were automatically disabled)
- [6.1.2] Datasets now support custom Measurement Time Calculation Commands. This will enable custom measurement times beyond today and yesterday, for example: 2 days ago
- [6.1.2] Dataset Reports Fix: Duplicating a report will now copy Filters and preserve formatting
- [6.1.2] Portal Pages now allow for usage of our External Reference Hierarchy control—this will allow for the selection of a specific object in a BI tool (like a Tableau Worksheet)
- [6.1.2] Bursts are now listed with special colors based on their status.
- [6.1.2] External Report Templates have been introduced so that you can programmatically build new External Reports based on a template.
- [6.1.2] External Report Types have an updated setting called "Allow iframe Embedding". This is a simple label change to clarify what the setting does: give the External Report builder the option to embed a report via iframe rather than just grabbing static images (previously the setting was named "Allow Embedding of Live Visualization")
- [6.1.2] Homepage Tile Rebuild is a new option for admins who need to rebuild all tiles on the Homepage (Admin > Utilities > Rebuild Homepage Tiles). This is usually done when major security/access changes have been made and tiles are no longer accurate.
- [6.1.3] Datasets and Elements no longer include SQL Query Builder—it was a legacy component that we have said goodbye to 👋
- [6.1.3] Remote Data Processors now have a clearer UI: we removed some legacy labels and cleaned up the list and the editor
- [6.1.3] Data Collection Triggers now support entering multiple email addresses for error reports
- [6.1.3] Import/Export will now include uploaded images when exporting External Content elements
- [6.1.3] External Reports now support an "All" value
- [6.1.3] New User Defaults now support the ability to select a Portal Page as the start location for a user
- [6.1.3] Data Collection Triggers have an updated "Trigger Now" function that will display downstream dependencies
- [6.1.3] Access Requests for content will now CC (send the same email to) the requesting User with each content access request (previously, they were only sent to Admins meant to handle requests)
- [6.1.3] Email Templates have been removed: "Favorite" and "Favorite Simplified" are no longer included with the product
- [6.1.3] Dataset Views now default to Public when they are created
- [6.1.3] Dimension Value Key and Display columns are now the same by default
- [6.1.3] New Config Variable: Ability to hide the "My Mobile" menu item for all users via DISPLAY_MENU_ITEM_FOR_MY_MOBILE
- [6.1.2] We've added support for Red Hat EL / CentOS 8
- [6.1.2] We now offer a Vagrantfile for CentOS 8
- [6.1.2] We now support Docker Swarm
- [6.1.2] The Installer will now allow you to set your desired timezone
- [6.1.2] mi-db-move now supports Azure MySQL
- [6.1.2] We've added AWS Cloudwatch logging into our Cloudformation deployment template (similar to the Terraform template that already existed)
- [6.1.2] We will now stop an installation if Linux has a umask setting that is not the default
- [6.1.2] We've added a --db-name option to the Installer that will allow adding a database prefix for the application
- [6.1.3] We implemented some changes to prevent Slow HTTP Attacks against Apache
- [6.1.3] We now provide the ability to revoke the MYSQL_ROOT_USER (super user) access after installation
- [6.1.3] We added more garbage collection for temp files in /opt/mi/iv/data/temp, including nb_* (no extension) files, files whose name consists of digits only (no extension), *.html, *.png, *.pdf, *.csv
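The temp-file cleanup rules above can be mirrored in a small matcher. This is a sketch approximating the documented list—the real garbage collector's exact matching logic (and any age thresholds it applies) is an assumption here:

```python
import re

# Approximation of the documented cleanup patterns for
# /opt/mi/iv/data/temp: nb_* files with no extension, digits-only
# names with no extension, and *.html / *.png / *.pdf / *.csv files.
# The real collector's exact rules are an assumption.
PATTERNS = [
    re.compile(r"^nb_[^.]*$"),            # nb_* with no extension
    re.compile(r"^\d+$"),                 # digits only, no extension
    re.compile(r"\.(html|png|pdf|csv)$"), # listed extensions
]

def is_collectible(filename):
    """Return True if the filename matches a documented cleanup pattern."""
    return any(p.search(filename) for p in PATTERNS)

print(is_collectible("nb_12ab"), is_collectible("keep.txt"))
```

A matcher like this is handy for auditing what a cleanup pass would remove before relying on it.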
- [6.1.3] The Web and Data Processor containers now include the following system tools: ping, telnet, wget, vim, netstat
- [6.1.3] We adjusted the default memory allocation for the Data Processor
- [6.1.3] Our Docker Swarm deployment manifest now accepts custom named NFS shared volumes (not just /opt/mi/data)
- [6.1.3] Our Docker Swarm deployment manifest will now generate a 'credentials' folder with the necessary *.env files
- [6.1.3] Portworx: If a custom class name is provided, we'll automatically comment out "kind: StorageClass" in our deployment manifest
- [6.1.3] Credentials for the monitoring service have been added to insight.conf
- [6.1.3] We'll now tell you if we can't find the Python interpreter
- [6.1.3] We now support Amazon Linux 2
- [6.1.3] mi-ldap-usersync now allows for mapping to Tableau Trust Auth users