This week, we had the opportunity to be at the unveiling of the Infinix Zero 4 and Zero 4 Plus. In the video below, we talk about everything we know so far about these devices.
pglogical is a logical replication system implemented entirely as a PostgreSQL extension. It is fully integrated and requires no triggers or external programs. pglogical asynchronously replicates only changes in the data using logical decoding. This makes it very efficient, since only the differences are replicated, and also tolerant of network faults, since replication can resume after the fault.

After a frustrating session trying to set up pglogical, I've decided to document some of the quirks and missing information from the documentation. All in all, the documentation is great, but there are some things I wish I had known before I started which would have made my experience less painful. Note that I won't be covering the installation process, as it is well documented. However, I do have a PostgreSQL Ansible role for Ubuntu that handles it as well, if you want to go that route.

Note: this post does not attempt to be a replacement for the official documentation and only serves to point out the quirks I ran into along the way.
We generally use auto-increment integers (or serial in the case of Postgres) for our primary keys. However, by default, the Identity framework in MVC is set up to use string keys. In this post, we detail how to change this to work with integers.

First, create your custom models for IdentityUser and IdentityRole:

```csharp
public class ApplicationUser : IdentityUser<int> { }

public class ApplicationRole : IdentityRole<int> { }
```

Then update your DbContext to use your custom models:

```csharp
public class ApplicationDbContext : IdentityDbContext<ApplicationUser, ApplicationRole, int>
{
    ...
}
```

Finally, we change the configuration of the Identity service in Startup to match:

```csharp
public void ConfigureServices(IServiceCollection services)
{
    ...
    services.AddIdentity<ApplicationUser, ApplicationRole>()
        .AddEntityFrameworkStores<ApplicationDbContext, int>()
        .AddDefaultTokenProviders();
    ...
}
```

If you are using migrations, you will need to update them as well.
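Once this is wired up, the integer key flows through everywhere you resolve the current user. The controller below is a minimal sketch of this in use; the controller and action names are hypothetical and not part of the steps above:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Identity;
using Microsoft.AspNetCore.Mvc;

// Hypothetical controller, shown only to illustrate that the key is now an int.
public class ProfileController : Controller
{
    private readonly UserManager<ApplicationUser> _userManager;

    public ProfileController(UserManager<ApplicationUser> userManager)
    {
        _userManager = userManager;
    }

    public async Task<IActionResult> Index()
    {
        // Resolve the ApplicationUser for the signed-in principal.
        var user = await _userManager.GetUserAsync(User);

        // Id is now an int rather than the default string.
        int userId = user.Id;

        return Ok(userId);
    }
}
```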
We recently completed a rewrite of a client's application which handled the generation and delivery of statements. With six months of hindsight, we had the opportunity to do this properly. In this series, we will talk about some of the issues we faced and how we solved them.

Choosing our Technology Stack

The entire process is broken down into the following steps:

- Create a job defining batches of companies to process. Each batch contains a list of items representing each customer in that company.
- Each item is responsible for generating the customer's statement, uploading it to S3 and sending the customer an email.
- When the items in a batch are complete, the customers' statements are combined into slices of 100, uploaded, and an email is sent to the company's representative.
- When all batches are done, the job is marked as complete.

The Foolishness of Youth

Our initial solution used Hangfire to handle our queues and background processing, roughly as sketched below. It seemed like a good idea at the time.
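To make the shape of that first attempt concrete, here is a minimal sketch of the Hangfire-based fan-out. The StatementJobs class and its method names are hypothetical, not taken from the client's codebase; the sketch only shows one background job being enqueued per customer item:

```csharp
using System.Collections.Generic;
using Hangfire;

// Hypothetical job class, sketched only to show the per-item fan-out with Hangfire.
public class StatementJobs
{
    // One batch per company: enqueue a background job for each customer in the batch.
    public void ProcessCompanyBatch(int companyId, IEnumerable<int> customerIds)
    {
        foreach (var customerId in customerIds)
        {
            BackgroundJob.Enqueue<StatementJobs>(jobs => jobs.ProcessCustomerItem(companyId, customerId));
        }
    }

    // Each item generates the customer's statement, uploads it to S3 and emails the customer.
    public void ProcessCustomerItem(int companyId, int customerId)
    {
        // Statement generation, S3 upload and email delivery are omitted from this sketch.
    }
}
```

Hangfire's BackgroundJob.Enqueue simply pushes each call onto a queue for a worker process to pick up, which is what made it look like a natural fit at first.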