Comparison: Dynamics NAV 2018 vs 2016

After reading a few blog posts about the performance of NAV 2018, I decided it might be interesting to do some comparison runs to see if there were any significant differences. The first caveat here is that these runs are intended to compare NAV Server performance between the two versions and do not take network bandwidth, network latency or database server latency into account.

The load tests are run against the standard NAV W1 demo application with the initial demo data. I ran the NAV servers on the same virtual machine using the latest nav-docker images. The NAV Server versions used for the comparison were NAV 2018 CU4 and NAV 2016 CU18.

I ran the NAV Load Test Order Processor scenarios with a maximum load of 50 concurrent users. In the test scenario, a user creates one or more sales orders with 10 lines each. I ran the load test for 10 minutes, with approx. 2000 server transactions completed in each test.
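
For orientation, the Order Processor scenario in the Visual Studio load test project has roughly the shape sketched below. This is only an illustration of the structure (a test method whose body is executed by each simulated user); helper names such as RunScenario, OpenNewSalesOrder and AddSalesLine are placeholders, not the toolkit’s actual API.

```csharp
[TestMethod]
public void CreateSalesOrderWith10Lines()
{
    // Each virtual user in the load test run executes this scenario body.
    RunScenario(userContext =>
    {
        // Open a new sales order and pick a random customer
        // (OpenNewSalesOrder / SelectRandomCustomer are illustrative placeholders).
        var salesOrder = OpenNewSalesOrder(userContext);
        SelectRandomCustomer(salesOrder);

        // Add 10 lines, each with a random item and quantity.
        for (var i = 0; i < 10; i++)
        {
            AddSalesLine(salesOrder, RandomItemNo(), RandomQuantity());
        }

        // Close the page; the order is persisted, so the dataset grows during the run.
        ClosePage(salesOrder);
    });
}
```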

The results were not very interesting. Across repeated test runs the differences were small (< 0.2 sec) and probably not noticeable to the user. Here is an overview.

[Figure: NAV2018PerfComparison – overview of the NAV 2018 vs NAV 2016 load test results]

This comparison does not mean that users will not experience performance differences when upgrading to NAV 2018. How a customer solution will perform depends on the application used and the size of the database. As described in the blog post by Alex, Addressing Performance Problem in Dynamics NAV and Dynamics 365 Business Central, the addition of charts and tiles on some standard application pages can have a significant impact on performance.

In summary, there is nothing to indicate that there is a significant performance regression in NAV 2018. Let me know if you think otherwise and have an application that you would like to benchmark.

Dynamics NAV Load Test Framework Updated for NAV 2018

I just updated the Dynamics NAV Load Test Framework for NAV 2018. The changes are mostly due to simplification of the demo application, like record number lookups that are changed to name lookups. There are also a few extra confirmation dialogs that need to be handled. The Small Business role center pages have changed, so I refactored the scenarios to use the Purchasing Agent role center. Finally, the test assemblies have been updated to the latest version, which is available in the NAV 2018 downloads TestAssemblies folder.
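
The extra confirmation dialogs are handled inside the scenario code. The sketch below shows the general idea; the helper names are illustrative rather than the framework’s actual methods.

```csharp
// An action that now raises a confirmation in NAV 2018 (for example, posting)
// must answer the dialog instead of failing on an unexpected form.
// InvokeActionAndCatchDialog / AcceptDialog are illustrative placeholder names.
var confirmation = InvokeActionAndCatchDialog(salesOrderPage, "Post");
if (confirmation != null)
{
    AcceptDialog(confirmation); // answer "Yes" so the scenario can continue
}
```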

You can see all the changes here.

I have been running a few comparisons with NAV 2016 using the Nav Docker images. I will post some results soon. Stay tuned.

 

 

Creating a NAV Server Availability Set using the Azure Load Balancer

This post describes the steps needed to set up NAV Virtual Machines in an Availability Set behind an Azure Load Balancer. This provides high availability for a NAV Server and simple load distribution. Before reading you should already be familiar with the NAV Azure Image, the NAV Demo Environment and the Azure Resource Manager. You can read more about the Load Balancer here: Azure Load Balancer Overview

The following Azure PowerShell script creates an Azure Resource Group with two NAV Virtual Machines in an Availability Set and a Load Balancer with rules configured for the NAV demo environment on the Dynamics NAV 2016 gallery image.

Before running the script you need to be connected to your Azure Subscription using the Login-AzureRmAccount cmdlet and you need to update the $testName variable to something unique and meaningful. The script will prompt for the admin credentials for the virtual machines to be created.

When the script completes and the Virtual Machines are created, you can connect to the virtual machines via RDP using the Azure Portal and run the “Initialize Virtual Machine” script to create the demo environment. When prompted for the cloud service name you should provide the FQDN of the Load Balancer public IP; the FQDN is displayed at the end of the script run. If you are using self-signed certificates, you can reuse the certificate generated for the first virtual machine when running the script on the second virtual machine.

The Load Balancer has rules that enable requests to the default site on port 80, the NAV Web Client on port 443 (HTTPS) and the NAV Client Service on port 7046. These ports work with the NAV demo environment on the Dynamics NAV 2016 gallery image.

The Azure Load Balancer distributes requests between the two virtual machines. The Load Balancer rules for port 443 and port 7046 are configured with session persistence, so that once a client creates a session with one of the virtual machines, the load balancer continues to direct that client’s requests to the virtual machine where the NAV client session was created.

 

Here is the script which is hosted in a Gist:

 

Differences between NAV Unit Tests and Performance Tests

A recent question on NAVLoadTest asking about a possible “Combined Unit & load-testing module” using the NAV Application Test Toolset got me thinking about the differences between Application “Unit Tests” and Performance Tests. There are some important differences in the goals and the design of Performance Tests like the NAVLoadTest scenarios and Application Tests written using the NAV Application Test Toolset.

  1. Goals:
    1. Application Tests are designed to verify correct functionality of a module.
    2. Performance Tests are designed to measure some aspect of system performance.
  2. Scope:
    1. Application Tests are designed to test individual C/AL objects and methods in isolation. The tests are executed in the NAV Server only so there is no client-server communication involved.
    2. Performance Tests test end-to-end user interactions. They run using the NAV client service which is hosted in IIS. This means that the tests measure the resources consumed by the client layer, the NAV Server, SQL Server and the communications between those layers.
  3. Data Isolation:
    1. Application Tests are designed to be data-independent and be executed in isolation from other tests. Any changes done to the database through running of tests from the Test Tool are automatically rolled back using the Test Isolation feature.
    2. Performance Tests are dependent on existing data, create persistent data and are impacted by the data created by other tests. One of the goals of the Load Test Scenarios is to observe how the test scenario performance changes as the dataset grows. One of the hardest parts of writing load test scenarios is ensuring that the tests continue to run predictably as the dataset changes.
  4. Test Verification:
    1. Application Tests follow the “Arrange – Act – Assert” pattern of unit tests (see http://c2.com/cgi/wiki?ArrangeActAssert); a minimal example follows this list. They ensure the state has not been changed unexpectedly during the test.
    2. Performance Tests have no way of controlling the initial state, as other tests can be running concurrently on the same database. They must be resilient to changes in data and to errors that occur during test execution, and handle them appropriately as a user might. For example, the “another user has locked the record” error occurs frequently in load tests when there is a concurrent user load.
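
To make the contrast concrete, here is the “Arrange – Act – Assert” shape as a plain MSTest unit test in C#. It is a generic illustration only; the Item and PriceCalculator types are hypothetical and defined here just so the example is self-contained, and are not part of the NAV toolsets.

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical domain types, included only to make the example compile and run.
public class Item
{
    public decimal UnitPrice { get; set; }
}

public class PriceCalculator
{
    private readonly int discountThreshold;
    private readonly decimal discountPct;

    public PriceCalculator(int discountThreshold, decimal discountPct)
    {
        this.discountThreshold = discountThreshold;
        this.discountPct = discountPct;
    }

    public decimal GetLinePrice(Item item, int quantity)
    {
        var price = item.UnitPrice * quantity;
        return quantity >= discountThreshold ? price * (1 - discountPct / 100m) : price;
    }
}

[TestClass]
public class PriceCalculatorTests
{
    [TestMethod]
    public void UnitPrice_IsDiscounted_ForLargeQuantities()
    {
        // Arrange: create exactly the state the test needs.
        var item = new Item { UnitPrice = 100m };
        var calculator = new PriceCalculator(discountThreshold: 10, discountPct: 5m);

        // Act: perform the single operation under test.
        var linePrice = calculator.GetLinePrice(item, quantity: 12);

        // Assert: verify the outcome; nothing else should have changed.
        Assert.AreEqual(1140m, linePrice); // 12 * 100 with a 5% discount applied
    }
}
```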

There are probably more differences that I didn’t cover. When writing performance tests, you may find it easy to start with the scenarios used in some application tests, but whenever I attempt to reuse an existing test as a performance test I end up needing to rewrite it to cover situations that don’t occur in the original test.

Recent Updates to the NAVLoadTest Repository

After returning from Directions I have made 2 updates to the NAVLoadTest repo.

The first change is the result of a request made during the Directions workshop: use lookup controls when selecting records randomly instead of opening list pages. The InvokeCatchLookup extension method invokes the Lookup SystemAction and expects to catch a Lookup Form. See “Feature/use control lookups”.
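
In a scenario the extension method is used roughly as sketched below: invoke the lookup on a field control, pick a row on the lookup form that comes back, and let the selection flow back into the field. Apart from InvokeCatchLookup, which is named above, the control and row-selection helpers are illustrative placeholders.

```csharp
// "customerNameControl" stands for a field control on the open page;
// SelectRandomRow and CloseWithOk are placeholder names for picking and confirming a row.
var lookupForm = customerNameControl.InvokeCatchLookup(); // invokes the Lookup SystemAction
SelectRandomRow(lookupForm);   // choose a record in the lookup list
CloseWithOk(lookupForm);       // close the lookup; the selected value flows back to the field
```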

The second change adds the basic Small Business User scenarios to the project to demonstrate how to use other role centers and pages. See “Feature/small business scenarios”.

NAV Performance Test Toolkit Supports NAV 2016

I have just spent 3 days at Directions EMEA 2015 where I presented the NAV Performance Test Toolkit with Freddy and did a workshop on writing performance tests. The workshop had good attendance and I got some good feedback and suggestions for improvements. You can see an issue created during the session here: Issues.

I have recently updated the NAVLoadTest repository with support for NAV 2016. The changes for NAV 2016 include references to the updated NAV Client Framework Library and some changes to the authentication code. The Client Framework library appears to have had some significant updates and now uses JSON over HTTP instead of WCF. If you are interested in seeing how that works, take a look at the communication with Fiddler while running the tests.

Please add your requests for improvements and other feedback to the NAVLoadTest issues list. This is the main repository for the toolkit and will always contain the latest version of the toolkit. The other repositories won’t change as often, as they are used for demonstrations and hands-on labs and need to be kept in sync with the other demonstration materials.

There are a few other improvements made recently, in particular a change that allows you to add more than 5 records to a list (see the NAVLoadTest Commits for details).

It was great to hear from so many people that are using the toolkit. I hope to be able to push some more improvements soon. Stay tuned.

How to Write NAV Load Tests Scenarios Using Visual Studio

I created 2 videos about writing simple load test scenarios for Dynamics NAV using Visual Studio. The first video covers a simple scenario for opening and closing a page. The second video shows how to write a scenario to create a Purchase Order. Both examples also show how to add tests to the Visual Studio Load Test.
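
Stripped to its essentials, the first scenario (open and close a page) has the shape below. The page-opening helpers are placeholders for whichever calls the framework provides; the point is the structure of a scenario method.

```csharp
[TestMethod]
public void OpenAndCloseCustomerList()
{
    RunScenario(userContext =>
    {
        // OpenPage / ClosePage are illustrative placeholders; page 22 is the
        // Customer List in the standard W1 demo application.
        var customerList = OpenPage(userContext, pageId: 22);
        ClosePage(customerList);
    });
}
```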

The videos are available on YouTube: