Testing and Comparing 5 REST API Clients

Know your client performance first

While I was working on a performance test of my folaris server, I wanted to find out how well it performs. Performance obviously depends on the available network bandwidth and CPU power. I was happy to see it performs well without using a lot of memory or processor time. First I tested with PowerShell/Invoke-RestMethod on the localhost interface and got an average of 606 calls per second. [Read More]
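The measurement itself is simple to sketch. Below is a minimal Python version of the idea with a pluggable call function; the post used PowerShell's Invoke-RestMethod against localhost, so the lambda here is only a stand-in for a real HTTP request.

```python
import time

def measure_calls_per_second(call, duration=1.0):
    """Invoke `call` repeatedly for `duration` seconds and return the rate."""
    count = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration:
        call()
        count += 1
    return count / (time.perf_counter() - start)

# Stand-in for a real request, e.g. requests.get("http://localhost:8080/")
rate = measure_calls_per_second(lambda: None, duration=0.1)
print(f"{rate:.0f} calls/sec")
```

Swapping the lambda for an actual client call gives a rough calls-per-second figure comparable across clients.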

Aerospike Evaluation

Launching an Aerospike server Docker container and testing with the dotnet core client

Summary: Aerospike is an open-source NoSQL database with some similarities to Redis. The whole database can reside in memory or on the file system. It uses the term Record, the equivalent of a row in an RDBMS. The record size is limited by the server's write-block-size startup configuration parameter; supported sizes are 1, 2, 4, and 8 MB, and no record can exceed the configured size. It is not designed to support every type of application. [Read More]

SQLAlchemy Table Copy

Copying small data sets using SQLAlchemy

SQLAlchemy is widely used; for example, Pandas read_sql uses SQLAlchemy to connect to a supported database. In this test, I want to transfer all records between two identical tables. This is useful when you need to move data between two non-production environments. It is not the most efficient way to move data, but I will show that it's a very convenient way to move small data sets under 500K records. [Read More]
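A minimal sketch of the approach, using two in-memory SQLite engines in place of the source and target environments; the table and column names here are invented for illustration.

```python
from sqlalchemy import (MetaData, Table, Column, Integer, String,
                        create_engine, select)

# Two in-memory SQLite databases stand in for the two environments.
src_engine = create_engine("sqlite://")
dst_engine = create_engine("sqlite://")

metadata = MetaData()
people = Table("people", metadata,
               Column("id", Integer, primary_key=True),
               Column("name", String(50)))
metadata.create_all(src_engine)
metadata.create_all(dst_engine)

with src_engine.begin() as conn:
    conn.execute(people.insert(),
                 [{"id": 1, "name": "Ada"}, {"id": 2, "name": "Linus"}])

# Read every row from the source and bulk-insert into the identical target.
with src_engine.connect() as src, dst_engine.begin() as dst:
    rows = [dict(r._mapping) for r in src.execute(select(people))]
    dst.execute(people.insert(), rows)

with dst_engine.connect() as conn:
    copied = conn.execute(select(people)).fetchall()
print(len(copied))  # 2
```

For small data sets this read-all-then-insert pattern is plenty; for anything large you would chunk the reads or use the database's native bulk-load tools.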

Testing Apache Airflow

Apache Airflow DAG and plugin test

Code is on GitHub. Summary: Apache Airflow is a Python-based workflow scheduler, so you have to know Python to use it. I will focus on writing a DAG, the workflow definition, in Python. Then I will adapt the code into an Airflow plugin to make the DAG file more readable. References: Apache Airflow GitHub, Airflow Docker image. DAGs: the Python workflow definition files are stored in a subdirectory configured in Airflow. [Read More]
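As a sketch of what such a DAG file can look like (the file, DAG, and task names are invented, and the operator import path is the Airflow 1.x one):

```python
# dags/hello_dag.py -- goes in the folder set by `dags_folder` in airflow.cfg
from datetime import datetime
from airflow import DAG
from airflow.operators.python_operator import PythonOperator  # Airflow 1.x path

def greet():
    print("hello from Airflow")

dag = DAG(dag_id="hello_dag",
          start_date=datetime(2019, 1, 1),
          schedule_interval="@daily")

task = PythonOperator(task_id="greet", python_callable=greet, dag=dag)
```

The scheduler picks the file up automatically once it is in the DAG folder; no registration step is needed.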

AWS SAM CLI

AWS Lambda tests

AWS Lambda looks simple when you look at the code, and it's easy to see the benefits, but it's tedious to get it to work. Take a look at Alex Edwards' very good tutorial. AWS SAM CLI aims to make it simpler. Testing aws-sam-cli: my SAM CLI testing was based on code generated by aws-sam-cli. Local testing is done using Docker images from lambci/lambda. I also described the issues with running Docker containers that need Internet access, and how to solve them. [Read More]
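A Python Lambda handler of the kind aws-sam-cli generates boils down to something like the sketch below (function and message are illustrative, not SAM's exact template output). `sam local invoke` runs it inside a Docker container, but you can also call it directly:

```python
import json

def lambda_handler(event, context):
    # Minimal API-Gateway-style response: status code plus a JSON body.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "hello world"}),
    }

# Direct call with an empty event, as a plain unit test would do.
response = lambda_handler({}, None)
print(response["statusCode"])  # 200
```

Calling the handler directly like this is the fastest feedback loop; the Docker-based `sam local invoke` then checks the packaged runtime environment.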

A Piece of Cake

Testing dotnet tool Cake or C# Make

Summary: Cake's main purpose is to let you use C# to write your build logic. It's similar to Rake and FAKE. The Cake web site has pretty good documentation about its features and usage. I will test only the following: installing Cake, setting up VS Code, and demonstrating how Cake manages dependencies when you need to write code in C#. Installation: dotnet tool install --global cake --version 0.32.1. You may also use the Cake bootstrap script to install and execute. [Read More]

Get a taste of Go

Run Postgres query and send results to a csv

Today is the first day of spring, time for something new. Summary: use Go to access the lab Postgres server running in a Docker container, save query results to a CSV file, and check out how testing works. Is Go the programming language of the cloud? You decide. The code: source code is on my GitHub repo. I had to spend more time than I expected to figure out how to get nullable table column values. [Read More]
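The post handles nullable columns with Go's database/sql types; the same NULL-to-CSV concern, sketched here in stdlib Python with SQLite standing in for the lab Postgres server (table and data invented):

```python
import csv
import io
import sqlite3

# SQLite stands in for Postgres; the NULL-handling idea is the same.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE city (name TEXT, state TEXT)")
conn.execute("INSERT INTO city VALUES ('Boston', 'MA'), ('Unknown', NULL)")

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["name", "state"])
for name, state in conn.execute("SELECT name, state FROM city"):
    # A NULL column comes back as None; write it as an empty CSV field.
    writer.writerow([name, state if state is not None else ""])

print(buf.getvalue())
```

In Go the equivalent is scanning into sql.NullString and checking its Valid flag before writing the field.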

JDK 11 new features

Single-file source code and a starter Maven project for JDK 11

Summary: testing a single-file source program, and a Maven project with JDBC and CSV export. Oracle just released JDK 12 today. Like most developers, I am still using JDK 1.7 or 1.8, but I am curious, so I spent some time on JDK 11. Single-file source code: basically you do not need to compile before you run. Java single-file source has no dependency management: you still have to set up the classpath just as when you run a compiled Java class file. [Read More]

Dotnet Core Test

Postgres access with Entity Framework and ADO.Net

Following the lab setup, in this post I will show how to use dotnet core to access a Postgres db. Summary: create a dotnet core console project, use Entity Framework to generate context and entity code, write code to run a query using Entity Framework, write code to run a query using ADO.Net/Npgsql, and run a speed test. Create a dotnet project and add Postgres packages: dotnet new console -n CmdlineApp; cd CmdlineApp; dotnet add package Npgsql --version 4. [Read More]

Lab Setup for a Test Drive

A Postgres server and developer computer setup

Source code: my Test-Drive GitHub repo. Summary: build a Postgres Docker image, start a new container, get data from the College Scorecard web site, perform an ETL to get a subset of the data, install a PostgreSQL client on Windows, and load the data to the server. Build a custom Docker image based on Postgres 11. Dockerfile and create user/db script:

FROM postgres:11.2
COPY ./create-bruno-db.sh /docker-entrypoint-initdb.d/create-bruno-db.sh

create-bruno-db.sh: this file must be executable. If you edit it on Windows, make sure you convert the Windows CR LF line endings to the Linux newline character LF. [Read More]
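The CR LF conversion can be done with dos2unix, or with a few lines of Python; the sample file content below is invented just to keep the sketch self-contained:

```python
import os
import tempfile

# A stand-in for create-bruno-db.sh, written with Windows CR LF endings.
path = os.path.join(tempfile.mkdtemp(), "create-bruno-db.sh")
with open(path, "wb") as f:
    f.write(b"#!/bin/sh\r\ncreatedb bruno\r\n")

# Rewrite CR LF as LF so the script runs correctly in the Linux container.
with open(path, "rb") as f:
    data = f.read()
with open(path, "wb") as f:
    f.write(data.replace(b"\r\n", b"\n"))

with open(path, "rb") as f:
    fixed = f.read()
print(b"\r" in fixed)  # False
```

Opening the file in binary mode matters: text mode would translate line endings behind your back and hide the problem.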