A random collection of notes and thoughts.

Creating a Contact form using Amazon Web Services

I recently wanted to add a simple contact form to my static website. When users submit the form, I wanted to receive their message via email. Amazon S3 hosts the site. I thought of three options to handle the dynamic behavior of parsing the form and sending me an email:

  1. Use a VPS to parse the form and send the email.
  2. Use a third party service.
  3. Use Amazon API Gateway and AWS Lambda.

Option 1 seemed like overkill. Options 2 and 3 both appealed to me. I eventually settled on using Amazon's services due to their low cost. The remainder of this post contains my notes about setting up the services. See the Amazon documentation for a more in-depth look at each service.

Lambda Function

Create a new Lambda function with a Node.js runtime. Use code like the following, modified from lithostec:

 'use strict';
var AWS = require('aws-sdk');
var ses = new AWS.SES({apiVersion: '2010-12-01'});

function validateEmail(email) {
    var tester = /^[-!#$%&'*+\/0-9=?A-Z^_a-z`{|}~](\.?[-!#$%&'*+\/0-9=?A-Z^_a-z`{|}~])*@[a-zA-Z0-9](-?\.?[a-zA-Z0-9])*(\.[a-zA-Z](-?[a-zA-Z0-9])*)+$/;
    if (!email) return false;
    if (email.length > 254) return false;
    var valid = tester.test(email);
    if (!valid) return false;
    var parts = email.split("@");
    if (parts[0].length > 64) return false;
    var domainParts = parts[1].split(".");
    if (domainParts.some(function(part) { return part.length > 63; })) return false;
    return true;
}

exports.handler = (event, context, callback) => {
    console.log('Received event:', JSON.stringify(event, null, 2));
    console.log('Received context:', JSON.stringify(context, null, 2));
    if (! { context.fail('We are sorry, an error has occurred. Please try again later.'); return; }
    if (! { context.fail('Email is required.'); return; }
    if (! || === '') { context.fail('You must provide a message.'); return; }
    if (! || === '') { context.fail('You must provide your name.'); return; }
    var email = unescape(;
    if (!validateEmail(email)) { context.fail('You must provide a valid email address.'); return; }
    var messageParts = [];
    var replyTo = + " <" + email + ">";
    messageParts.push("Message: " +;
    var subject = "Contact Form Email - " + email;
    var params = {
        // Placeholder address: substitute your own SES-verified recipient.
        Destination: { ToAddresses: [ 'Your Name <>' ] },
        Message: {
            Body: { Text: { Data: messageParts.join("\r\n"), Charset: 'UTF-8' } },
            Subject: { Data: subject, Charset: 'UTF-8' }
        },
        // Placeholder address: substitute your own SES-verified sender.
        Source: "Contact Form <>",
        ReplyToAddresses: [ replyTo ]
    };
    ses.sendEmail(params, function(err, data) {
        if (err) {
            console.log(err, err.stack);
            context.fail('We are sorry, an error has occurred. Please try again later.');
        } else {
            context.succeed('Thank you for contacting us!');
        }
    });
};

Your form's data will be in the event JSON under the "data" key (see the mapping template below). This assumes you already have Amazon SES set up.
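
For reference, after API Gateway applies the body mapping template described in the next section, the event the Lambda function receives looks roughly like this (the field values are purely illustrative):

```json
{
  "data": {
    "name": "Jane Doe",
    "email": "",
    "message": "Hello from the contact form!"
  }
}
```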

Amazon API Gateway

Create a new API or use an existing one. Create a resource to handle the form submission. I named my resource "/submit-contact-form". Add a POST method. Select the new method and navigate to the Integration Request section.

Set the Integration Type to Lambda Function and set Lambda Function to your previously created function.

Expand Body Mapping Templates. Select "When there are no templates defined". Add a Content-Type of "application/x-www-form-urlencoded". Set the template to the following code, courtesy of Marcus's Stack Overflow answer:

"data": {
#foreach( $token in $input.path('$').split('&') )
#set( $keyVal = $token.split('=') )
#set( $keyValSize = $keyVal.size() )
#if( $keyValSize >= 1 )
#set( $key = $util.urlDecode($keyVal[0]) )
#if( $keyValSize >= 2 )
#set( $val = $util.urlDecode($keyVal[1]) )
#set( $val = '' )
"$key": "$val"#if($foreach.hasNext),#end

Enable CORS unless your API and your site are served from the same domain (for example, if you use a custom domain name for the API). Deploy the API.

Contact Form

For the HTML side, I had the Submit button of the contact form execute an Ajax request to the API with a function similar to this:

 function submitContactForm() {
    $.ajax({
        type: "POST",
        url: '', // your deployed API's invoke URL
        data: $('#contact-form').serialize(),
        dataType: 'json',
        crossDomain: true,
        success: function(message) {
            if (message.errorMessage) {
                // The Lambda function failed: show message.errorMessage.
            } else {
                // Success: show a thank-you message.
            }
        },
        error: function(message) {
            // The request failed: show a generic error.
        }
    });
}

Bitmap sort from Programming Pearls

In the first chapter of Programming Pearls, the topic revolves around choosing the correct algorithm to sort a large file of random phone numbers. The discussion starts by suggesting mergesort or a similar well-known algorithm. But due to the restricted domain of inputs and the costly disk access back in the day, the author proposed the following solution:

  • Create a bitmap of size 10000000 initialized to 0.
  • Read the file and for each phone number, set bitmap[phone-number] = 1.
  • Iterate through the bitmap, and for each non-zero element write the index to the output.

This produced a sorted list and performed better than mergesort because it incurred far less disk access overhead.

The results made me wonder if the same performance gains would occur with today's fast reading SSDs. So I created the following Common Lisp script to compare merge sort, the default sort from SBCL, and bitmap sort:

 (alexandria:define-constant +max-phone-number+ 9999999)

(deftype phone-number () `(integer 0 ,+max-phone-number+))
(deftype phone-number-vector () '(vector phone-number))

(defun random-phone-number ()
  (random +max-phone-number+))

(defun write-phone-number-file (file-path max-numbers)
  (with-open-file (file file-path :direction :output :if-exists :supersede)
    ;; Use a hash table to ensure we do not write any duplicate phone
    ;; numbers.
    (let ((hash-table (make-hash-table :test #'equalp :size max-numbers)))
      (loop repeat max-numbers
            for phone-number = (random-phone-number)
            unless (gethash phone-number hash-table)
              do (format file "~A~%" (setf (gethash phone-number hash-table) phone-number))))))

(defun count-file-lines (file-path)
  (with-open-file (file file-path)
    (loop for line = (read-line file nil)
          for count from 0
          while line
          finally (return count))))

(defun read-phone-number-file-to-vector (file-path)
  (let* ((phone-number-count (count-file-lines file-path))
         (phone-number-vector (make-array (list phone-number-count) :element-type 'phone-number)))
    (with-open-file (file file-path)
      (loop for line = (read-line file nil)
            for index from 0
            while line
            do (setf (aref phone-number-vector index) (parse-integer line))))
    (the phone-number-vector phone-number-vector)))

(defun default-sort (file-path)
  (let ((phone-number-vector (read-phone-number-file-to-vector file-path)))
    (sort phone-number-vector #'<)))

(defun do-merge-sort (phone-number-vector)
  (declare (phone-number-vector phone-number-vector))
  (if (<= (length phone-number-vector) 1)
      phone-number-vector
      (let ((half (truncate (/ (length phone-number-vector) 2))))
        (merge 'phone-number-vector
               (do-merge-sort (subseq phone-number-vector 0 half))
               (do-merge-sort (subseq phone-number-vector half))
               #'<))))

(defun merge-sort (file-path)
  (let ((phone-number-vector (read-phone-number-file-to-vector file-path)))
    (do-merge-sort phone-number-vector)))

(defun bitmap-sort (file-path)
  (let* ((bitmap (make-array (list (1+ +max-phone-number+)) :element-type 'bit :initial-element 0))
         (phone-number-count (count-file-lines file-path))
         (phone-number-vector (make-array (list phone-number-count) :element-type 'phone-number)))
    (with-open-file (file file-path)
      (loop for line = (read-line file nil)
            while line
            do (setf (aref bitmap (parse-integer line)) 1)))
    (loop with phone-number-index = 0
          for x across bitmap
          for bitmap-index from 0
          when (= 1 x)
            do (setf (aref phone-number-vector phone-number-index) bitmap-index)
            and do (incf phone-number-index))
    phone-number-vector))

And the results:

 WB-SCRATCH> (time (progn (default-sort "/tmp/phonenumbers") nil))
Evaluation took:
1.395 seconds of real time
1.474052 seconds of total run time (1.469602 user, 0.004450 system)
[ Run times consist of 0.008 seconds GC time, and 1.467 seconds non-GC time. ]
105.66% CPU
3,347,659,604 processor cycles
96,823,648 bytes consed

WB-SCRATCH> (time (progn (merge-sort "/tmp/phonenumbers") nil))
Evaluation took:
2.017 seconds of real time
2.108636 seconds of total run time (2.108636 user, 0.000000 system)
104.56% CPU
4,840,574,542 processor cycles
369,787,504 bytes consed

WB-SCRATCH> (time (progn (bitmap-sort "/tmp/phonenumbers") nil))
Evaluation took:
0.476 seconds of real time
0.477942 seconds of total run time (0.477942 user, 0.000000 system)
100.42% CPU
1,141,716,964 processor cycles
98,101,856 bytes consed

So it looks like the same approach still results in a performance gain on today's SSDs. The merge sort implementation did use a lot of memory, as it allocated new arrays on every call, so it has room for improvement. However, the fact that bitmap sort beat the default implementation's sort supports the general thesis.

Seating for a wedding

In two weeks I will be getting married, and my future wife and I are scrambling to make the final arrangements for the big day. Part of this involves figuring out the seating for the wedding. We have 115 people to seat, and each table fits up to 8 people. Of course we wanted to seat families together, even if it meant having more than the minimum number of tables necessary. We needed to determine how many tables we needed and how many people would sit at each table. For example, if we chose 15 tables, a couple of seating options would be

 6 6 7 8 8 8 8 8 8 8 8 8 8 8 8 


 6 7 7 7 8 8 8 8 8 8 8 8 8 8 8 

Both of those options total 115.

We wanted to see what our options were, and I quickly realized we had a small combinatorial problem. But why struggle through this menial task when it is more suitable for a machine? I took this opportunity and turned to Common Lisp to compute the answers.

I decided to make use of the Common Lisp Screamer library, which provides constraint programming for Common Lisp. The code to compute the seating options for 15 tables is as follows:

 (defparameter *number-of-people* 115)
(defparameter *maximum-number-of-people-per-table* 8)
(defparameter *minimum-number-of-people-per-table* 6)

(defmacro screamer-create-list-of-ints (num)
  `(list ,@(loop for i from 1 to num collect
                 '(screamer:an-integer-betweenv *minimum-number-of-people-per-table*
                                                *maximum-number-of-people-per-table*))))

(screamer:all-values
  (let* ((vars (screamer-create-list-of-ints 15)))
    (screamer:assert! (screamer:applyv #'<= vars))
    (screamer:assert! (screamer:=v *number-of-people*
                                   (screamer:applyv #'+ vars)))
    (screamer:solution vars (screamer:static-ordering #'screamer:linear-force))))

The macro to create the variables was necessary because for some reason using a loop at execution time to create the variables hung the program.

After creating a list of integer variables, we make a couple of assertions. The first is that the list should be in ascending order. The second is that the sum of the list should equal the number of people attending our wedding, 115 in this case. The Screamer function "all-values" executes the code, backtracking until all solutions are found.
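
As a sanity check independent of Screamer, the same options can be enumerated by brute force. The function below is a hypothetical helper (not part of the original code) that builds the non-decreasing lists of table sizes directly:

```lisp
;; Hypothetical cross-check, independent of Screamer: enumerate the
;; non-decreasing lists of TABLES table sizes between LO and HI that
;; sum exactly to PEOPLE.
(defun table-options (people tables lo hi)
  (if (zerop tables)
      (if (zerop people) (list '()) '())
      ;; Choose the smallest table size N first, then recurse with N as
      ;; the new lower bound so the list stays non-decreasing.
      (loop for n from lo to (min hi people)
            append (mapcar (lambda (rest) (cons n rest))
                           (table-options (- people n) (1- tables) n hi)))))

;; (table-options 115 15 6 8) should yield the same three 15-table
;; options listed in the results below.
```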

The results are as follows:

 15 tables:

6 6 7 8 8 8 8 8 8 8 8 8 8 8 8
6 7 7 7 8 8 8 8 8 8 8 8 8 8 8
7 7 7 7 7 8 8 8 8 8 8 8 8 8 8

16 tables:

6 6 6 6 6 6 7 8 8 8 8 8 8 8 8 8
6 6 6 6 6 7 7 7 8 8 8 8 8 8 8 8
6 6 6 6 7 7 7 7 7 8 8 8 8 8 8 8
6 6 6 7 7 7 7 7 7 7 8 8 8 8 8 8
6 6 7 7 7 7 7 7 7 7 7 8 8 8 8 8
6 7 7 7 7 7 7 7 7 7 7 7 8 8 8 8
7 7 7 7 7 7 7 7 7 7 7 7 7 8 8 8

17 tables:

6 6 6 6 6 6 6 6 6 6 7 8 8 8 8 8 8
6 6 6 6 6 6 6 6 6 7 7 7 8 8 8 8 8
6 6 6 6 6 6 6 6 7 7 7 7 7 8 8 8 8
6 6 6 6 6 6 6 7 7 7 7 7 7 7 8 8 8
6 6 6 6 6 6 7 7 7 7 7 7 7 7 7 8 8
6 6 6 6 6 7 7 7 7 7 7 7 7 7 7 7 8
6 6 6 6 7 7 7 7 7 7 7 7 7 7 7 7 7

18 tables:

6 6 6 6 6 6 6 6 6 6 6 6 6 6 7 8 8 8
6 6 6 6 6 6 6 6 6 6 6 6 6 7 7 7 8 8
6 6 6 6 6 6 6 6 6 6 6 6 7 7 7 7 7 8
6 6 6 6 6 6 6 6 6 6 6 7 7 7 7 7 7 7

Updated Site

I got tired of the design of my old site, so I decided to start anew. What started out as some simple adjustments to a WordPress template turned into creating a homemade blogging engine (what self-respecting web developer hasn't created their own?).

The blogging engine I created is extremely simple. It is written in Common Lisp and generates static HTML files for the site. Every entry consists of a file that I drop into the appropriate directory. Each file has a timestamp and other properties in its header. The blog engine reads the entry directories, applies the entries to the site's design templates, and outputs the HTML files; then I simply rsync the site to the production server.

I decided to leave comments off. The signal-to-spam ratio is far too low. Please see my contact page if you wish to get in touch.

Right now the site does not have any pagination for each of the sections. I think the pages are small enough so that it isn't an issue, but I may change that in the future.

I also left out any social media links. To me, it's easy enough to share a link, so adding those buttons just becomes noise on the site. I wanted the layout to be clean and content oriented.

Common Lisp Hyperspec in ebook format

The Common Lisp Hyperspec is freely available for download. It is great to have at your fingertips when programming, but I also wanted to read it on my couch rather than staring at my computer screen for hours on end.

The spec is available to purchase as a PDF, but posts here and here indicate that the quality is sub-par. There is also a PS version. However, I wanted something that could be read on my Kindle DX without having to zoom in and out all the time. The best way I found was to take the Hyperspec and convert it using Sigil. Since this may be against the terms and conditions of the Hyperspec, I will not publish the resulting file.

Converting the Hyperspec using Sigil was pretty straightforward. With Sigil open, I first added all the files in the Hyperspec's Front folder under Sigil's Text folder. Then I added all of the Hyperspec's Body folder (this will take a minute or two, so be patient). Then just save and you will have an epub version. I then used Calibre to convert it and upload it to my Kindle.

The resulting epub file is not perfect. The glossary doesn't seem to be present and each chapter's Dictionary section was moved to the end of the file. Also many of the links do not work.

If anyone takes the time to fix these issues please let me know! But for now this version will suffice for my needs.

The follies of Online shopping

I limit the number of TV shows I follow at any given time to one. I feel watching TV is abused way too often in the US. When people finish their chores and come by the special gift of free time, a whole world of opportunities opens ahead of them: opportunities to create, to learn, to help and extend the knowledge of mankind. But instead of seizing this precious opportunity, they plop on the couch with a bag of salty potato chips and fry their brains watching undergarment commercials.

This is a waste of life. Have you ever watched someone watching TV? Try recording yourself watching TV for one hour. What you see is a thoughtless person rotting away. Of course, then you realize that watching yourself watching TV is the same as watching TV, and your brain starts spinning. Recursion!

But TV does have its place. The best time to watch TV or movies is when you are eating. Eating and reading at the same time is awkward and frustrating, and it's more difficult to fully immerse yourself in the material while you use brain cycles to chew and poke at food. So at dinner time, my girlfriend and I routinely watch TV shows we both agree on. The latest show we both became hooked on is House.

We watched the first two seasons in a little over a month. When the time came to purchase the third, we went online to check the prices. We found Barnes and Noble had season three for only $29. The nearest Barnes and Noble was about twenty minutes away, and we had to go into town anyway, so we decided we would pick it up in store rather than waiting for it to ship.

We found season three at the store, but to our surprise it was priced exorbitantly at over $40! We then went to one of their computers and confirmed that the online price was indeed only $29. So I found the nearest employee to inquire about the price difference.

The insipid employee wasn't much help. Apparently stocking fees are why the cost is $10 more in the store. The employee said he could order it online for us right there and have it arrive at our house, but we couldn't take the one sitting on the shelf for $29. I became a bit fervid, but I realized the employee was just a policy-following peon. Those Barnes and Noble policies are still an enigma to me today.

We ordered season three online later that night from Amazon rather than Barnes and Noble, a small rebellious victory on my part. Two days later I received an email with the package tracking number. To my surprise, it shipped from the same town as the Barnes and Noble, some twenty minutes away! As the week wore on, I continued following the package, perplexed about why it was taking so long. Apparently FedEx shipped it to their location in our town. Then the postal service picked it up at the FedEx location. After bringing it to the postal service's distribution center, they finally delivered it, and of course we missed the delivery and had to wait 30 minutes in line at the post office the next day.

All in all, it took over a week for season three to reach us, even though it started twenty minutes away. If FedEx had notified us and given us the option to pick it up in the first place, we would have done that in a heartbeat. Of course I'm sure they have their policies too.

Rant over!


Welcome to my new blog! You can access my old blog here: