Notes From a Full-Stack Developer

Kevin Raffay's WordPress Blog

Angular 4 and CSS

I was working through the latest Angular tutorial and had some trouble getting a stylesheet to work. The stylesheet was styles.css at the project root, and the tutorial told me to simply add a link to it in index.html as usual:

<link rel="stylesheet" href="styles.css">

I quickly found a Stack Overflow post explaining that any stylesheet has to be registered in the .angular-cli.json file, but when I checked, I already had a "styles.css" entry. I still saw the missing-file warning, and then saw on SO that I was not the only one to experience this. Evidently, you can't use CSS links in Angular HTML pages; the CSS files listed in .angular-cli.json auto-magically get bundled and downloaded to the client!
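For reference, here is a minimal sketch of the relevant part of .angular-cli.json. The surrounding keys vary a bit by CLI version, and the second stylesheet is a hypothetical entry just to show the array accepts multiple files:

"apps": [
  {
    "root": "src",
    "styles": [
      "styles.css",
      "theme.css"
    ]
  }
]

Anything listed there gets compiled into the bundle on the next ng serve or ng build, so no <link> tag is needed.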

Written by Kevin Raffay

August 29, 2017 at 10:42 pm

Posted in Uncategorized

NPM Firewall Issue

I recently came across an issue when trying to install json-server, a REST API mock tool.  I am using this library temporarily so that I can focus on Angular 4 development, and it serves the purpose of getting a fake back-end running quickly until I get my Web API tier working.
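For anyone who hasn't used it, json-server turns a plain JSON file into a working fake REST API with a single command. A minimal sketch, with hypothetical db.json contents:

{
  "customers": [
    { "id": 1, "name": "Jane Doe" }
  ]
}

json-server --watch db.json

That exposes GET/POST/PUT/DELETE endpoints at http://localhost:3000/customers, which is plenty to build an Angular front-end against.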

When running "npm install json-server --save", however, I got a bunch of npm errors:

npm ERR! Windows_NT 6.3.9600
npm ERR! argv "G:\\Program Files\\nodejs\\node.exe" "G:\\Program Files\\nodejs\\node_modules\\npm\\bin\\npm-cli.js" "install" "json-server" "--save"
npm ERR! node v6.11.1
npm ERR! npm v3.10.10
npm ERR! code UNABLE_TO_GET_ISSUER_CERT_LOCALLY

Usually this would lead me to Stack Overflow, but the answer was buried in a Node.js thread on GitHub. I was behind my company firewall, and fortunately I did not have to deal with an SSL certificate, which is what some of the answers on the internet alluded to. All it took was one command:

npm config set strict-ssl false

h/t Yehor Sergeenko
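One caveat worth adding: that setting disables certificate validation for npm entirely, so treat it as a temporary workaround. If your company publishes its root CA certificate, pointing npm at it is the cleaner fix, and the workaround is easy to revert (the certificate path below is hypothetical):

npm config set cafile "C:\certs\company-root-ca.pem"

npm config set strict-ssl true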

Written by Kevin Raffay

August 2, 2017 at 5:49 pm

Posted in Uncategorized

Angular CLI and Karma Naming Issue

I found another gotcha while setting up my Angular CLI project per the instructions at https://www.sitepoint.com/angular-2-tutorial/, because the Karma unit-test config file references the old Angular naming convention. Without going too deep into the issue, I had to change all instances of 'angular-cli' to '@angular/cli'. For example, the old naming had to be changed in the frameworks and plugins sections:

frameworks: ['jasmine', 'angular-cli'],
plugins: [
  require('karma-jasmine'),
  require('karma-chrome-launcher'),
  require('karma-remap-istanbul'),
  require('angular-cli/plugins/karma')
],

New format:

frameworks: ['jasmine', '@angular/cli'],
plugins: [
  require('karma-jasmine'),
  require('karma-chrome-launcher'),
  require('karma-remap-istanbul'),
  require('@angular/cli/plugins/karma')
],
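With the renames in place, re-running the test suite is the quickest way to confirm Karma can resolve the plugin again:

ng test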

Written by Kevin Raffay

July 26, 2017 at 9:12 pm

Posted in Uncategorized

Setting up Angular 4 with Bootstrap

I've been doing a lot of Angular development at my current contract, and there was a bit of a learning curve because of all the recent changes. In the next few posts I'll try to document the steps to get up and running, along with links to the resources that helped me.

I recommend starting with this Sitepoint post:

The Ultimate Angular CLI Reference Guide

When you get to the point where you can see "app works!" in the browser, adding Bootstrap support is not that intuitive. First, run this in the same folder as your angular-cli.json file:

npm install --save bootstrap

You will then need to update the angular-cli.json file. In the apps/styles node, add a reference to the Bootstrap CSS file:

"styles": ["../node_modules/bootstrap/dist/css/bootstrap.min.css","styles.css"],

To confirm that Bootstrap is installed, run ng serve and you will see that your "app works!" text is no longer rendered in a Times New Roman font, but in Arial.
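If the font change feels too subtle, another quick test (my own hypothetical markup, not part of the tutorial) is to drop a Bootstrap-classed element into app.component.html:

<button class="btn btn-primary">Bootstrap is wired up</button>

If it renders as a solid blue button rather than an unstyled gray one, the stylesheet made it into the bundle.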

Written by Kevin Raffay

July 18, 2017 at 6:58 pm

Posted in Uncategorized

How I Used SSIS to Save My Company a Ton of Money

I recently implemented an SSIS solution to automate a complicated process that my company used to send out promotional emails. The old process took a few days and involved several people: developers, DBAs, etc. It was brutally tedious and complicated. For those of you who don't know, SSIS is Microsoft's SQL Server Integration Services, which used to be DTS (Data Transformation Services).

To summarize: each month my company would send out a few million emails to former customers, trying to entice them to renew their memberships. How these customers were selected was codified in a series of SQL scripts steeped in tribal knowledge and arcane business rules. To top it off, these rules changed each month, based on the whims of a Vice President whose job it was to maintain the stream of recurring cash flow that kept the business running. There was a constant effort to tweak the criteria and optimize the response rates of these emails, while not spamming or harassing former members who had no interest in renewing.

This process was ripe for automation, and SSIS beats ad-hoc SQL — most of the time.  Keep in mind that SSIS is just a tool, and as I always say: a fool with a tool is still a fool.  I would have to make sure that all this work would actually result in some ROI.  If my SSIS packages were brittle, unmaintainable, and cryptic, I would have replaced one faulty process with another, even more complicated one.

To make matters worse, once this data was collected, it had to be approved by the VP, and if there was any doubt about the criteria, the whole process would be run again until the results were satisfactory to the powers that be. For example, if last month's run consisted of 2.2 million members but this month's run only had 2.19 million, a junior developer would have to bird-dog the discrepancy and report a detailed explanation to the VP. Did I mention that these issues had to be resolved in time to send the emails out on the first day of the month? If the junior dev started the process at the end of the month and the numbers didn't jibe, guess what: she would be there late into the night of the last day of the month, running queries.

After seeing this junior developer data-diddling her career away (she dreamed of writing JavaScript and mobile apps, not running SQL scripts that took hours to complete), I decided to do something about it. Over the course of a few weeks, I extracted all the arcane business rules embedded in thousands of lines of legacy SQL and codified them into a robust SSIS solution that Extracted the production data, Transformed it by applying the business rules used to identify valid data, and Loaded the data into CSV files: the classic ETL process.

My first challenge was to get away from running real-time queries against a production database to get the initial set of former members. That process alone would take a developer hours to collect, validate, and eventually turn into CSV files that would be uploaded to our email provider. I essentially designed a poor man's data warehouse, staging the production data into my sandbox using SSIS, and then having my SSIS packages call stored procedures to transform it. Oh, I know what you are saying: don't you have a data warehouse or reporting team to do this? Well, yes, but for reasons I can't detail here, that team reported to a different manager and there were office politics involved.

Along the way, I learned a lot more about SSIS than I cared to (being a full-stack developer makes me hesitant to operate on one tier for too long), but it was a worthy effort. What used to take days now takes hours, and that is just the time for the ETL process to run on the back-end. It is not like some dev is sitting there waiting for the results of a query to pop up in SSMS so that she can copy and paste them into Excel (yes, you heard correctly: she was taught to literally copy and paste the results into an Excel file and save that as a CSV). I never understood why this was the process, and when I asked, I was given the typical cargo-cult response: "We always did it this way."

For those that are interested, I summarized this project in a deck posted on Slideshare.

Written by Kevin Raffay

July 31, 2016 at 6:42 pm

Posted in Uncategorized

Basic CRUD Development With ORM/Telerik/WCF – Part 2 – ORM

For our project, we can use the Lite version of LLBLGen Pro, which is limited to eight entities per project.

Here is a screenshot of the LLBLGen Pro project I used:

[Screenshot: the LLBLGen Pro project (LPAGenPro)]

When working with the LLBLGen Pro ORM, I found a need to generate not only the regular entities, but also objects called "Typed Views" to serve as DTOs (Data Transfer Objects). Most ORM entities carry extra metadata to manage state, such as "IsDeleted" properties. This makes working with them powerful, but adds overhead to the payload in a Service-Oriented Architecture (SOA). I wanted to be able to use DTOs, or POCOs (Plain Old CLR Objects) as they are also known, between the service tier and the presentation layer. This ORM allows me to generate these DTOs, saving me a lot of time.

The class diagrams below show the difference between an entity and a typed view:

[Class diagram: entity vs. typed view (GenProClasses)]

Eventually, I want to serialize the typed views into lightweight JSON objects. Serializing entities creates a complex JSON structure that does not work well for a RESTful application, as we shall see.
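To make the payload difference concrete, here is a hypothetical comparison; the property names are invented for illustration rather than taken from LLBLGen Pro's actual serialized output, but the shape is representative. An entity drags its state-tracking metadata along:

{
  "Fields": {
    "Id": { "CurrentValue": 7, "IsChanged": false },
    "Name": { "CurrentValue": "Two Column", "IsChanged": false }
  },
  "IsNew": false,
  "IsDirty": false
}

The same record serialized from a typed view is just the data:

{ "Id": 7, "Name": "Two Column" }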

Written by Kevin Raffay

August 24, 2015 at 7:00 pm

Posted in CRUD, LLBLGen Pro

Basic CRUD Development With ORM/Telerik/WCF – Part 1 – Database

The LPA (Landing Page Admin) database is pretty basic, with eight tables, but I will only use four for now.

Here is the database diagram:

[Database diagram: LPA (LPA_DB)]

Templates – Each Landing Page has a template, which contains metadata about the page.

GlobalTokens – Tokens stored as named-value pairs that apply to all templates.

TemplateTokens – Each template will inherit a set of global tokens.

Patterns – Patterns are the layouts that define the template (2-column, 3-column, etc.)

You can download the SQL script below to create these objects, but it may be easier to create a database called "LPA" first and then execute only the part of the script that creates the objects. The script also contains some additional objects used for logging.
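If you just want the shape of the schema without downloading anything, here is a rough T-SQL sketch of the four tables; the column names are my own shorthand, so defer to the actual script for the real definitions:

CREATE TABLE Patterns (
    PatternId INT IDENTITY PRIMARY KEY,
    Name VARCHAR(50) NOT NULL -- '2-column', '3-column', etc.
);

CREATE TABLE Templates (
    TemplateId INT IDENTITY PRIMARY KEY,
    PatternId INT NOT NULL REFERENCES Patterns(PatternId),
    Name VARCHAR(100) NOT NULL
);

CREATE TABLE GlobalTokens (
    GlobalTokenId INT IDENTITY PRIMARY KEY,
    TokenName VARCHAR(50) NOT NULL, -- name/value pairs shared by all templates
    TokenValue VARCHAR(500) NULL
);

CREATE TABLE TemplateTokens (
    TemplateTokenId INT IDENTITY PRIMARY KEY,
    TemplateId INT NOT NULL REFERENCES Templates(TemplateId),
    GlobalTokenId INT NOT NULL REFERENCES GlobalTokens(GlobalTokenId),
    TokenValue VARCHAR(500) NULL -- per-template override of the global value
);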

 SQL Script on Dropbox

Create Database Script as Word Doc

Written by Kevin Raffay

August 16, 2015 at 2:14 am

Posted in CRUD, Microsoft
