Blog :: DLR Consultants

Getting the most out of Docker

Dom Roberts
1st October 2019

We spent much of today developing new services in core for a client and deploying them to Kubernetes through Docker. The evolution of the Docker platform over the last few years has changed the way code is developed and deployed. We are currently working on some exciting tools that use Docker to make old, static data sources, such as Excel files, dynamically accessible. The new system will allow companies to instantly deliver data APIs from their static data and have them consumed by multiple users simultaneously, either with everyone on the same shared dataset or with each user working on their own instance of the initial data. Docker and Kubernetes make all of this possible.
It's a great new product to add to our burgeoning suite of apps.

Payment Gateway Pricing

We have been investigating the different payment gateways available for a client of ours. There are so many options now that it takes some investigation to decide which is best suited to your needs; one solution does not fit all. You have to look at how your business will receive its money: is it a monthly recurring fee or a one-off payment, paid by credit card, debit card, and so on? Different combinations result in different fee levels.

Here is a rough breakdown of the main platforms and their fees.

Charges 1.9% + £0.20 per transaction when monthly transaction volume is between £1,500 and £55,000, and 2.9% + £0.20 per transaction when monthly volume is below £1,500.

Charges 2.9% + £0.30 per transaction, and 3.4% for recurring payments.

Charges 1.4% for payments within Europe and 2.9% for transactions outside Europe. Their documentation is very extensive and makes it much easier to set up the payment system to suit your needs.

WorldPay
There are two payment plans:
Pay As You Go: no monthly fee, 2.75% per transaction.
Monthly Subscription: £19.95 per month, plus 0.75% for debit cards and 2.75% for credit cards.
This is one of the biggest payment systems in the world at the moment, even more popular than PayPal, and it handles transactions in a huge number of currencies.
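As a worked example of how the two WorldPay plans compare: the credit-card rates are identical, so the subscription only pays off through debit-card volume. A quick sketch (illustrative only, using the rates quoted above) shows the break-even point:

```python
# Illustrative comparison of the two WorldPay plans quoted above.
# Pay As You Go: no monthly fee, 2.75% on every transaction.
# Monthly Subscription: GBP 19.95/month, 0.75% debit, 2.75% credit.

def monthly_cost_payg(debit_volume, credit_volume):
    """Total monthly fees on the Pay As You Go plan."""
    return 0.0275 * (debit_volume + credit_volume)

def monthly_cost_subscription(debit_volume, credit_volume):
    """Total monthly fees on the Monthly Subscription plan."""
    return 19.95 + 0.0075 * debit_volume + 0.0275 * credit_volume

# Credit-card rates cancel out, so the subscription pays for itself once
# monthly debit-card volume exceeds 19.95 / (0.0275 - 0.0075) = GBP 997.50.
break_even = 19.95 / (0.0275 - 0.0075)
print(f"Break-even debit volume: GBP {break_even:.2f}")
```

Below roughly £1,000 of monthly debit-card takings, Pay As You Go is the cheaper plan; above it, the subscription wins.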

Sage Pay
Charges £20.90 per month, plus 2.09% per transaction on credit cards and 0.74% on debit cards.
You have to remember with all of these systems that you can either link through to their own widgets or hosted payment pages, which take the user off your site to theirs to make the payment, or you can build an integrated system into your own site using their APIs, which requires developers. The additional benefit of the API route is that it can automate your back-end systems as well: using a company's hosted gateway means someone has to go in and reconcile those payments against your accounts each month. It is a choice between paying up front to build the software and the ongoing cost of manually reconciling payments.
At DLR Consultants we would be happy to talk you through these options and help to build out a bespoke software gateway for your business.

After the Summit

The London AI Summit is over for another year. We really enjoyed it this time, with some very interesting talks. We especially enjoyed the one from Fujitsu on their annealing system, which runs on quantum-inspired hardware. They made a point of making clear that this is a digital system and not one based on qubits.

Following on from this, we had a great chat with IBM about their CPLEX optimizer. This is an area of great interest for us: optimization algorithms are a field we really enjoy working in, and any opportunity to discuss the subject is always welcome. It is certainly a tool set we will be looking to make more use of in the future.

We also had a very interesting chat with Chris from The QuantFoundry; it was fascinating to talk to someone behind a start-up in the same field as ourselves.

We are now looking forward to next year's summit and can't wait to see what comes out of the new contacts we made at the conference.

London AI Summit

It was great to get down to the London AI Summit at ExCeL yesterday. We met lots of people from around the industry, and it was great to see so much activity in the AI space. The financial AI talks were excellent, and we are really looking forward to the quantum talks today.

There was an incredible amount of innovation on display, as well as representation from the giants of the software industry. It all shows what a major disruptor the AI rebirth has been, and how it truly is here to stay. We can only see the summit going from strength to strength over the coming years.

Is Scrum Agile enough in a microservices world?

Microservices are one of the key components of many modern system architectures. There are many benefits to keeping the scope of a service small, one of which is the ability to deliver updates in isolation. In a well-designed system, a service should represent a single piece of functionality rather than serve a multitude of purposes. The code base should be relatively small and ideally well tested, which makes it a great fit for a smooth CI/CD pipeline that can handle a short release cycle.

At one of our recent clients we spent some time adjusting the deployment process so that merges into trunk could progress easily through to deployment whenever required. This is an important place to be when your customers demand a high turnover of features. But how does this sit with an iterative development process such as Scrum?

Scrum works well when you have a large project or a set of co-dependent services released on a regular schedule. The team determines the content of the release at the beginning of an iteration and then, ideally, delivers it at the end, the result being a newly released package. In a fast-moving microservices landscape, though, releases can occur mid-sprint. Where does that fit in the Scrum process?

Well, there is nothing to say that it can't fit. One possibility is to state, as part of the sprint goals, that each component will be delivered as many times as required.

An alternative approach is to use Kanban. Kanban keeps the flow of work moving; there need not be a start and end point to the cycles. Instead, Kanban takes a section of work and moves it through the process quickly. Team members are reassigned to different parts of the pipeline as bottlenecks build, ensuring that items flow smoothly from design through to release. In this way, your items can be released to production each time a viable product is complete. Design and development start when there is capacity in those channels, and the team works on each step as required rather than over the course of an iteration.
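The pull-based flow described above can be sketched in a few lines. This is a minimal toy model, not a Kanban tool: the stage names and WIP limits are hypothetical, and items advance only when the downstream stage has capacity, so finished work flows out continuously rather than in end-of-sprint batches.

```python
# Toy Kanban board: items are pulled downstream only when the next
# stage is under its WIP limit. Stage names and limits are illustrative.
from collections import deque

STAGES = ["design", "develop", "test", "release"]
WIP_LIMITS = {"design": 2, "develop": 2, "test": 1, "release": 1}

def pull(board):
    """Advance items one step wherever downstream capacity exists;
    return the items released to production this cycle."""
    done = []
    # Walk the pipeline backwards so freed capacity propagates upstream
    # within a single pull cycle.
    for i in reversed(range(len(STAGES))):
        stage = STAGES[i]
        while board[stage]:
            if i == len(STAGES) - 1:
                done.append(board[stage].popleft())  # released to production
            elif len(board[STAGES[i + 1]]) < WIP_LIMITS[STAGES[i + 1]]:
                board[STAGES[i + 1]].append(board[stage].popleft())
            else:
                break  # downstream stage is at its WIP limit
    return done

board = {s: deque() for s in STAGES}
board["design"].extend(["item-1", "item-2", "item-3"])
released = []
while any(board[s] for s in STAGES):
    released += pull(board)
print(released)
```

Note that each item reaches production as soon as it clears the final stage; there is no notion of a sprint boundary anywhere in the model.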

When you move to a microservices design, maybe it's also time to rethink how agile you are and how agile you could be. Scrum has been a big part of development for well over a decade now and has benefited millions of teams, but it's not always the correct fit. Maybe it's time you looked at a different approach to maximise the productivity of your dev teams.

Add Application Insights without an Azure Account

Application Insights is a great tool for monitoring your applications. Out of the box it will record metrics about your application's incoming and outgoing traffic, monitor the application's state when running on the server (CPU usage, available memory in bytes), and provide standard exception tracking. However, a typical implementation requires an Azure account to make any use of this data outside of Visual Studio. That doesn't have to be the case, though. We have worked with clients to add a simple system that transfers the data to their Elasticsearch instance, and a variety of other endpoints are available as well.

The starting point is to add the relevant Application Insights libraries to your application. Using NuGet, import either the Microsoft.ApplicationInsights.Web or the Microsoft.ApplicationInsights.WindowsServer package (it is feasible to install both if you need the telemetry from both). The redirection of the telemetry data is achieved with an EventFlow pipeline. You need to install the Microsoft.Diagnostics.EventFlow.Inputs.ApplicationInsights package to bind to the telemetry being exported from App Insights, and then add an output; in this case we are going to use Microsoft.Diagnostics.EventFlow.Outputs.ElasticSearch to send our data out to Elasticsearch.

The pipeline to create is a DiagnosticPipeline from Microsoft.Diagnostics.EventFlow, which implements IDisposable, allowing you to wrap the Run call (in the case of an ASP.NET Core Web API service) in a using statement. The pipeline is created from a set of inputs, a sink and a HealthReporter. For the HealthReporter we have used a CsvHealthReporter, created as follows:

var reporter = new CsvHealthReporter(new CsvHealthReporterConfiguration());
Then an input can be created with
var input = new ApplicationInsightsInputFactory().CreateItem(null, reporter);
and an input list with
var inputs = new[] { input };
The sink is then constructed as an array (the ServiceUri and IndexNamePrefix values are your own and are left blank here):
var sinks = new[]
{
    new EventSink(
        new ElasticSearchOutput(
            new ElasticSearchOutputConfiguration
            {
                ServiceUri = /* your Elasticsearch endpoint */,
                IndexNamePrefix = /* your index prefix */
            },
            reporter),
        null)
};
Finally, the pipeline itself wraps the running application:
using (var pipeline = new DiagnosticPipeline(reporter, inputs, null, sinks, null, true))
{
    // run the application here, e.g. host.Run()
}

You should now have App Insights generating telemetry as specified in your Application Insights configuration and sending it through to your Elasticsearch instance.

More work is required to prevent the POST messages sent to Elasticsearch from being repeatedly listed in the telemetry as DependencyTelemetry calls, but I'll save that for another post.

Tournament Selection in Genetic Algorithms

Genetic algorithms are based on the Darwinian concept of survival of the fittest: the fitness of a solution is improved by breeding from the strongest genomes. Tournament selection is a powerful way of implementing this. On each iteration of the algorithm, multiple genomes are assigned to a tournament with a fixed number of competitors. After the fitness function has been scored, the competitors are compared and the winners move on to the next iteration.

There are many variants of how to handle the tournament: how many competitors to include, what to do with the losing participants, and so on. One way to avoid getting caught in local minima is to keep some losing genomes for a further round, or to move them into another tournament to cross-pollinate other segments of the batch. In tests while building a GA for the N-Queens problem, we tried varying tournament sizes. The initial implementation with two competitors showed a good improvement over no competition at all, but a tournament size of three removed invalid combinations more rapidly and homed in on the correct sequences.
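The core of the technique is small enough to show in full. The sketch below uses a hypothetical toy fitness function (counting 1-bits), not the N-Queens scoring from the experiment above: each selection draws k competitors at random and keeps the fittest, so stronger genomes breed more often without ever ranking the whole population.

```python
# Minimal tournament selection sketch. The bit-string genomes and
# sum-of-bits fitness are illustrative stand-ins for a real problem.
import random

def tournament_select(population, fitness, k=3, rng=random):
    """Return the fittest of k competitors drawn at random (no replacement)."""
    competitors = rng.sample(population, k)
    return max(competitors, key=fitness)

rng = random.Random(42)  # seeded for reproducibility
population = [[rng.randint(0, 1) for _ in range(8)] for _ in range(20)]
fitness = sum  # toy fitness: prefer genomes with more 1s

# Fill the next generation's breeding pool via repeated tournaments.
next_generation = [tournament_select(population, fitness, k=3, rng=rng)
                   for _ in range(len(population))]

print(sum(map(fitness, next_generation)) / len(next_generation))
```

Raising k increases selection pressure (the pool converges faster but loses diversity sooner), which is exactly the trade-off the tournament-size experiments above were probing.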

Tournament selection is a key tool in the armoury of the genetic algorithm developer. Every application of a GA varies in how it should be approached, and testing the parameters and implementation variations is an important step in the development process.