1 Introduction #
1.1 Implementation options #
Source
BI4Dynamics can join BC Cloud with your legacy BC or NAV in one fully integrated BI solution. The first source selected in BI4Dynamics must be BC Cloud; BC or NAV can be added optionally.
Destination
- Data warehouse is implemented on an Azure SQL Virtual Machine.
- Analytics can be implemented on the same Virtual Machine in SQL Server Analysis Services, or as Azure Analysis Services or Power BI Premium:
  - Use the same Virtual Machine when VM costs are low.
  - Use AAS when performance, scalability, and flexibility matter.
2 Azure SQL Virtual Machine #
2.1 Permissions and Requirements #
BC Cloud
- Dynamics 365 administrator (How-to) role,
- BI4Dynamics BC Extension must be installed on the Production environment only.
*Please note: Any other extensions you might be using in BC and plan to include in the BI project must be installed on both the Production and Sandbox environments.
Azure Portal
- Application Developer (How-to) for application registration,
- Administrator on the Azure VM where BI4Dynamics is installed,
- Azure Storage info: Subscription, Resource group, Account name, Container name and Blob Key.
2.2 Azure SQL Virtual Machine – hardware #
We have created a document that explains the Virtual Machine installation in detail.
Use the Azure SQL option and select a Virtual Machine with SQL Server 2022 Standard edition preinstalled.
Do not install a plain Virtual Machine (without SQL preinstalled), as it requires a separate SQL Server installation.
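As a complement to the portal-based installation described in the document above, the VM can also be created with the Azure CLI. Below is a minimal sketch, shown as a small Python wrapper around the `az` commands; the resource group, VM name, size, credentials and especially the image URN are placeholders/assumptions, so list the available SQL Server 2022 images first and verify the URN before creating the machine:

```python
# Create an Azure VM from a SQL Server 2022 Standard image with the Azure CLI (sketch).
import subprocess

def az(cmd: str) -> str:
    """Run an Azure CLI command and return its output."""
    return subprocess.run(f"az {cmd}", shell=True, check=True,
                          capture_output=True, text=True).stdout

# Discover the exact image URN first; the offer/SKU names used below are assumptions.
print(az("vm image list --publisher MicrosoftSQLServer --offer sql2022-ws2022 --all -o table"))

az("vm create --resource-group rg-bi4dynamics --name vm-bi4dynamics "
   "--image MicrosoftSQLServer:sql2022-ws2022:standard-gen2:latest "  # verify against the list above
   "--size Standard_D4s_v5 --admin-username bi4admin "
   "--admin-password REPLACE_WITH_STRONG_PASSWORD "                   # placeholder credentials
   "--public-ip-sku Standard")
```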
Azure SQL Virtual Machine Size
VM processing time, and consequently VM size, mostly depends on the BC database size. Here are two examples:
- BC Cloud database = 5 GB
- BC Cloud database = 20 GB
Calculations are based on a DW processing time of 1 hour, run 20 times each month. View the latest pricing here.
Attached disks: an Azure SQL VM comes with a 128 GB C drive; the OS and SQL files use about 40 GB. The remaining ~80 GB is enough for DW processing of BC databases up to 30 GB. If the BC database is bigger, add a data disk. Smaller disks are much slower, while bigger disks are much faster, so you may choose a larger disk for speed rather than size.
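The figures above can be turned into a rough estimate. The sketch below is a rule of thumb only, derived from the numbers stated above (about 80 GB of free disk for BC databases up to 30 GB, and 20 one-hour processing runs per month); the hourly VM price is a parameter you take from the Azure price list:

```python
# Rough disk and compute estimate based on the figures above (rule of thumb, not official sizing).

def estimate(bc_db_gb: float, vm_price_per_hour: float,
             runs_per_month: int = 20, hours_per_run: float = 1.0):
    """Return (free disk GB needed, whether an extra data disk is needed, monthly compute cost)."""
    disk_factor = 80 / 30                    # ~2.7 GB of free disk per GB of BC data (from the text)
    free_disk_gb = bc_db_gb * disk_factor
    needs_extra_disk = free_disk_gb > 80     # more than the default C drive leaves free
    monthly_cost = runs_per_month * hours_per_run * vm_price_per_hour
    return free_disk_gb, needs_extra_disk, monthly_cost

# Example: a 20 GB BC database and a hypothetical 0.60 EUR/hour VM rate
disk_gb, extra_disk, cost = estimate(20, 0.60)
print(f"Free disk needed: {disk_gb:.0f} GB | extra data disk: {extra_disk} | "
      f"processing cost per month: {cost:.2f}")
```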
Azure VM setup: the Virtual Machine should be configured with a Public IP and a DNS name.
2.3 Azure SQL Virtual Machine – Software #
Operating system: Azure SQL VM comes with the latest Windows Server 2019 or Windows Server 2022.
SQL Server (software): the following features must be installed.
- SQL Server Database engine
- SQL Server Analysis Services Tabular (when implementing Analysis Services locally)
- SQL Integration services
- PolyBase Query Services and Java connector for HDFS data sources (not required for SQL 2022)
PowerShell, .NET Framework version 4.7.2 or higher.
Azure CLI: a set of commands for creating and managing Azure resources, available here.
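A quick way to verify these prerequisites on the VM is sketched below; the registry key is the standard location of the .NET Framework 4.x release number (461808 or higher indicates 4.7.2+), and the two command probes simply check that PowerShell and the Azure CLI respond:

```python
# Quick prerequisite check on the Azure VM (illustrative sketch, Windows only).
import subprocess
import winreg

def dotnet_release() -> int:
    """Read the .NET Framework 4.x 'Release' number from the registry."""
    key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                         r"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full")
    value, _ = winreg.QueryValueEx(key, "Release")
    return value

def responds(cmd: str) -> bool:
    """Return True if the command exits successfully (tool is installed)."""
    try:
        subprocess.run(cmd, shell=True, check=True, capture_output=True)
        return True
    except subprocess.CalledProcessError:
        return False

print(".NET Framework 4.7.2+ :", dotnet_release() >= 461808)  # 461808 = 4.7.2 release number
print("PowerShell available  :", responds("powershell -Command $PSVersionTable.PSVersion"))
print("Azure CLI available   :", responds("az version"))
```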
Latest versions of system software on Azure VM
It is of utmost importance that all system software used in the BI project is on the latest version (or the latest minus one) and is updated regularly. Neglecting this may prevent BI4Dynamics from being installed or from operating properly.
ⓘ Note: A common misunderstanding is that SQL Server updates arrive automatically with Windows Update. They do not. SQL Server updates must be downloaded manually from the Microsoft page as cumulative updates (CU). Updates are released frequently, every few months.
https://learn.microsoft.com/en-us/troubleshoot/sql/releases/download-and-install-latest-updates
Even the original Microsoft Azure VM image does not have the latest version of SQL Server and thus must be updated manually.
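To check which version and CU level a server is currently running, you can query the instance directly. A minimal sketch using pyodbc; the driver name and connection string are assumptions for a local default instance:

```python
# Check the installed SQL Server build and cumulative update level (sketch).
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;Trusted_Connection=yes;"     # assumes a local default instance
)
row = conn.cursor().execute("""
    SELECT @@VERSION,
           SERVERPROPERTY('ProductVersion'),       -- build number
           SERVERPROPERTY('ProductUpdateLevel')    -- CU level, e.g. CU9 (NULL if none applied)
""").fetchone()
print(row[0])
print("Build:", row[1], "| CU level:", row[2])
```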
2.4 Other resources #
Azure Storage Account: a Storage Account with a Container, used for storing the BC tables exported from BC.
Azure Docker: a specialized cloud engine that manages the export of BC tables to Blob Storage.
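A minimal sketch for creating these resources with the Azure CLI and retrieving the Blob Key required during setup; the resource group, account, container and location names are placeholders:

```python
# Create the Storage Account and Container for the BC export, and fetch the account key (sketch).
import subprocess

RG        = "rg-bi4dynamics"       # placeholder resource group
ACCOUNT   = "bi4dynamicsstorage"   # placeholder account name (must be globally unique, lowercase)
CONTAINER = "bc-export"            # placeholder container name
LOCATION  = "westeurope"

def az(cmd: str) -> str:
    """Run an Azure CLI command and return its output."""
    return subprocess.run(f"az {cmd}", shell=True, check=True,
                          capture_output=True, text=True).stdout

az(f"storage account create --name {ACCOUNT} --resource-group {RG} "
   f"--location {LOCATION} --sku Standard_LRS")

# The first account key is the 'Blob Key' asked for during BI4Dynamics setup.
key = az(f'storage account keys list --account-name {ACCOUNT} '
         f'--resource-group {RG} --query "[0].value" -o tsv').strip()

az(f'storage container create --name {CONTAINER} '
   f'--account-name {ACCOUNT} --account-key "{key}"')

print("Blob Key:", key)
```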
3 Analysis Services #
Only the Tabular model of the analysis database is supported. The Tabular model can be implemented on:
A. Azure VM or local server,
B. Azure Analysis Services,
C. Power BI Premium as XMLA endpoint feature.
Permissions
| | Azure Portal | Analysis Database | Power BI Premium |
| --- | --- | --- | --- |
| Administrator | Analysis Services Admin | Server Administrator | Workspace Admin |
| Business User | Reader Role | Read Role | Viewer |
3.1 SQL Analysis Services on Azure Virtual Machine (A) #
Permissions
- Admin rights to Analysis Services Instance (How-To)
- Configure the Analysis Services service to run under an administrator account.
3.2 Azure Analysis Services AAS (B) #
AAS is an in-memory database available for user queries from any BI client (Excel, Power BI).
Permissions
- Administrator rights on Azure Analysis Services in the Azure Portal, and
- Administrator role on the Analysis Services server, assigned using SQL Server Management Studio (How-to)
- Configure the Analysis Services service to run under an administrator account.
On-Premises Gateway: provides a bridge for data transfer between on-premises data and Microsoft cloud services.
3.3 Power BI Premium (C) #
Permissions
- Admin role in Power BI Premium is required to update workspaces; Contributor role is needed to publish reports. (How-to)
- Configure the Analysis Services service to run under an administrator account.
On-Premises Gateway: provides a bridge for data transfer between on-premises data and Microsoft cloud services. Additional information on purchasing a Power BI Premium subscription can be found here.
Detailed information about:
- How to create and configure Azure Virtual Machine with SQL server
- How to Install BI4Dynamics application and configure On-premises Data Gateway
4 Best practice and recommendations #
4.1 SQL Server features and settings #
SQL Server editions
All BI4Dynamics features run on SQL Server Standard edition; Enterprise edition is also supported. Per-instance limitations of SQL Server Standard edition:
- SQL Server Database Engine: buffer pool limited to 128 GB
- Analysis Services Tabular: 16 GB of memory per instance
When the data size requires more hardware resources, the BI project should be split into two instances:
- Finance (Finance, Receivables, Payables, Fixed Assets, Bank Account)
- Operations (Sales, Inventory, Purchase, Retail, Manufacturing, Warehouse, Service)
SQL Server settings
Volumes, drives: use separate volumes/drives for Data (#1) and for Log & TempDB (#2).
Temp db: set 4 or 8 data files and 1 log file to avoid the GAM/PFS page contention issue described here (a configuration sketch follows this list).
Collation: the collation of the data warehouse server should match the collation of the BC database.
UAC should be disabled, or the application needs to be installed outside the Program Files folder.
Network availability:
- Online servers: the server is connected to the Internet with port 80 open (if the server is behind a proxy, the application needs access to our authorization web service)
- Offline servers: special offline license is required; please contact sales@bi4dynamics.com
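Below is the configuration sketch referenced above, sent as T-SQL through pyodbc: it reads the server collation so it can be compared with the BC database, and adds extra tempdb data files. The connection string, file path, sizes and file count are placeholders that must match your volume layout:

```python
# Check server collation and add tempdb data files (sketch; adjust paths and sizes to your drives).
import pyodbc

conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                      "SERVER=localhost;Trusted_Connection=yes;", autocommit=True)
cur = conn.cursor()

# 1. The DW server collation should match the BC database collation.
print("Server collation:", cur.execute("SELECT SERVERPROPERTY('Collation')").fetchone()[0])

# 2. Add tempdb data files (4 in total here) to avoid GAM/PFS page contention.
#    'T:\TempDB' is a placeholder path on the Log & Temp volume.
for i in range(2, 5):   # the instance already has one data file; this adds tempdev2..tempdev4
    cur.execute(f"""
        ALTER DATABASE tempdb ADD FILE (
            NAME = tempdev{i},
            FILENAME = 'T:\\TempDB\\tempdb{i}.ndf',
            SIZE = 1024MB, FILEGROWTH = 256MB)
    """)
```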
Integration services
Integration Services can speed up Stage and Data warehouse processing by 50 to 150% by running stored procedures in parallel. If it is not installed or not selected, processing will be much slower.
PolyBase
Install PolyBase Query Services and the Java connector for HDFS data sources. These SQL Server features are used, among other things, for loading Blob storage files (not required for SQL 2022).
Columnstore option
BI4Dynamics supports Columnstore storage in the staging area (schema = stage) and the Data warehouse area (schema = fact). Columnstore can achieve 10X compression. The dim schema is not supported with Columnstore, as dimension tables are usually smaller and compression does not help performance. Loading time may increase by 10-20% due to building Columnstore indexes.
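To see which stage and fact tables actually received a Columnstore index and how large they are, you can query the standard SQL Server catalog views. A minimal sketch, again via pyodbc; the data warehouse database name is a placeholder:

```python
# List clustered Columnstore indexes in the stage and fact schemas (sketch).
import pyodbc

conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                      "SERVER=localhost;DATABASE=BI4Dynamics_DW;"   # placeholder DW database name
                      "Trusted_Connection=yes;")
rows = conn.cursor().execute("""
    SELECT s.name AS schema_name, t.name AS table_name, SUM(p.rows) AS row_count
    FROM sys.indexes i
    JOIN sys.tables t     ON t.object_id = i.object_id
    JOIN sys.schemas s    ON s.schema_id = t.schema_id
    JOIN sys.partitions p ON p.object_id = i.object_id AND p.index_id = i.index_id
    WHERE i.type_desc = 'CLUSTERED COLUMNSTORE'
      AND s.name IN ('stage', 'fact')
    GROUP BY s.name, t.name
    ORDER BY row_count DESC
""").fetchall()

for r in rows:
    print(f"{r.schema_name}.{r.table_name}: {r.row_count} rows")
```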
4.2 Faster processing and querying #
Fast data warehouse processing
- Fast disks: disk speed is the most important factor, as the data warehouse reads and writes a lot of data
- Enough RAM: an amount of RAM below the recommended size will significantly slow down processing.
- Good CPU with many cores
Fast Analysis Services processing and querying
- The data warehouse engine requires fast disks and many cores for parallelism.
- The Tabular engine requires a fast CPU and fast memory.
These are the hardware components that affect Tabular engine performance:
- Fast CPU: higher CPU speed reduces query time (the interaction between front-end tools such as Excel and Power BI and the Tabular engine while a user is analyzing data).
- Fast memory: fast memory can be more expensive, but it is worth the investment.
- Memory size: should be 2X the total size of all Tabular databases; if there is not enough memory, data will be partly written to disk and queries will be slow or fail altogether.
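The 2X rule can be checked against the on-disk size of the Tabular databases, which is a reasonable proxy for their in-memory size. A minimal sketch; the SSAS data directory path is an assumption for a default SQL Server 2022 Tabular instance and must be adjusted to your installation:

```python
# Estimate recommended RAM as 2X the total on-disk size of all Tabular databases (sketch).
import os

# Placeholder path: default data directory of a local SSAS Tabular instance (SQL Server 2022).
SSAS_DATA_DIR = r"C:\Program Files\Microsoft SQL Server\MSAS16.MSSQLSERVER\OLAP\Data"

total_bytes = sum(
    os.path.getsize(os.path.join(root, name))
    for root, _, files in os.walk(SSAS_DATA_DIR)
    for name in files
)
total_gb = total_bytes / 1024**3
print(f"Tabular databases on disk: {total_gb:.1f} GB")
print(f"Recommended RAM (2X rule): {2 * total_gb:.0f} GB")
```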