
How We Built smapiot.com

Published on 2019-10-19.
A recap of how we designed and built smapiot.com from scratch to reflect our principles and our values.


In June, we redesigned our website from scratch. For us, it was an important step on the roadmap toward realizing our five-year vision. As a result, we spent quite some time at the drawing board discussing what makes sense and what should be discarded.

We started with the simple question: What do we stand for? Finding the core principles and values of smapiot helped us to craft the website in the same spirit.

The Content

Page Content

Obviously, we had some sections in mind that were more or less mandatory. Most notably, we wanted to allocate room for showing some of our previous and current projects. We also required space to present our consulting services. As far as new additions are concerned, we knew that our open positions (or general job information), as well as our team, had to be presented somewhere.

Before we went into any design considerations, we - as a team - discussed what we like about the existing website. What should be kept? What has to be kept (e.g., due to legal constraints)? What should be omitted? Once we figured out what we would like to see, it was time to create a design proposal.

The Design

There are three key points that we tried to follow: our website design should fit our corporate design guidelines, it needs to be minimalistic yet professional (after all, we are not designers, but we care about great design and usability), and it must be fast so that it can be consumed without much friction.

Using Adobe XD, we created a small design prototype of the website that we discussed and iterated online.

Designing the prototype

An important aspect of the design proposal was that it had to scale easily. A mobile-first design was therefore never stated explicitly - it was implicit. Every part of the page should be transferable - in its design language - to any screen size. We achieved this by evaluating not the pages as a whole, but the individual (design) building blocks. When using, for example, a tile, that tile must come with a specification that tells us how it behaves at any viewport dimension.

A direct consequence of our focus on content presentation is the high importance of fonts and text in general. We set some rules that had to be followed by every component - in every composition:

  • Have a maximum of 80 characters per line
  • Use a minimum padding of 15px
  • Keep the line height at roughly 1.4
  • Use a font size of at least 16px
  • Highlight key content (e.g., bold)
  • Use at most 2-3 fonts per page (one simple font for body text, one more distinctive font for headlines, etc.) - less is more
  • Use at most 2-3 text colors per page (not counting shades)
  • Always use labels with icons (unless the icons are brand logos, which are mostly self-explanatory)
  • Do not use ambiguous icons (it should be 100% clear what they mean)
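
As a minimal illustration (the token names and values below are made up, not taken from our actual codebase), such rules can be captured as shared design tokens that every React component reuses:

import * as React from 'react';

// Illustrative design tokens reflecting the rules above; not our real stylesheet.
export const typography = {
  baseFontSize: '16px',   // never below 16px
  lineHeight: 1.4,        // roughly 1.4 for body text
  maxLineLength: '80ch',  // at most ~80 characters per line
  minPadding: '15px',     // minimum padding around text blocks
};

// Base style that text-heavy components can spread into their own styles.
export const bodyText: React.CSSProperties = {
  fontSize: typography.baseFontSize,
  lineHeight: typography.lineHeight,
  maxWidth: typography.maxLineLength,
  padding: typography.minPadding,
};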

Possible Technologies

Web Technologies

In order to find a suitable implementation, we first had to identify what we actually want to deliver. As defined in our principles, our website should be as fast as possible. Since smapiot usually works with modern technologies, it was clear that the technology for our website had to be state-of-the-art as well. We determined that our target users will definitely have JavaScript enabled. We also concluded that the initial rendering does not suffer much if it is performed purely on the client side.

We investigated two opposite directions for our website: either to use a static-site generator (SSG) or to go all-in on a single-page application (SPA). We evaluated the two options in separate proof of concept (PoC) implementations first. While the SSG PoC was done with Jekyll, we chose React to implement the PoC for the SPA solution.

We used the following metrics for our final judgment:

Area                    | React (SPA) | Jekyll (SSG)
User Experience         | +           | 0
Load Time               | +           | ++
Admin Experience        | +           | +
Content Management      | +           | ++
Developer Experience    | ++          | 0

Obviously, both approaches would have given us a good solution. The flexibility and simplified development were reason enough for us to choose the SPA approach.

There was also a middle-ground solution called Gatsby. We evaluated this project carefully and concluded that its advantages (slightly better performance if done right, accessibility without JavaScript) are overshadowed by the implied complexity. Every simple thing had to be done via plugins, and some plugins that were important for us, e.g., for TypeScript, either had severe bugs or did not work at all.

Implementation Details

Going for an SPA made sense, since it still allows us to scale and modify our homepage quickly. We created a small continuous integration (CI) pipeline to ensure continuous delivery (CD) after accepting pull requests with code changes. An important factor was that we could make extensive use of bundle splitting to load only the components relevant for displaying the current page.
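
As a rough sketch of that idea (the file names are illustrative, and our real bindings are generated rather than hand-written, as described next), route-level bundle splitting in React looks roughly like this:

import * as React from 'react';

// Each page lives in its own bundle and is only downloaded when its route is visited.
// The path is illustrative; in our setup these bindings are emitted by a codegen module.
const BlogPage = React.lazy(() => import('./pages/blog'));

// The page is rendered inside a Suspense boundary that shows a fallback
// while the bundle for the requested page is still loading.
export const LazyBlogPage = () => (
  <React.Suspense fallback={<div>Loading …</div>}>
    <BlogPage />
  </React.Suspense>
);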

In the end, this individual scaling was possible with React in combination with Parcel. Using a plugin called Codegen, we can generate the (lazy-loading) bindings for all pages without writing them explicitly. With this plugin, we create codegen modules, which emit dynamically created modules on the fly.

For our blog posts, the codegen module generating the list of posts looks similar to the following code snippet:

const { parseExpressionAt } = require('acorn');
const { join, relative, basename } = require('path');
const { readdirSync, statSync, readFileSync } = require('fs');
const { detectMeta, extractMeta, blogPath, pageId, latest } = require('@smapiot/website-helpers');

// Find and evaluate the statically declared meta object inside a page's source code.
function getMeta(page) {
  const result = detectMeta(page.content);

  if (result) {
    const expr = parseExpressionAt(page.content, result.offset);
    const meta = page.content.substring(expr.start, expr.end);
    return eval(`(${meta})`);
  }

  return {};
}

module.exports = function() {
  // Collect all blog post modules (.tsx files) from the pages/blog directory.
  const dir = join(__dirname, '..', 'pages', 'blog');
  const pageDetails = readdirSync(dir)
    .map(m => join(dir, m))
    .filter(m => statSync(m).isFile() && m.endsWith('.tsx'))
    .map(m => ({
      path: m,
      defaultRoute: blogPath(dir, m),
      id: pageId(m),
      content: readFileSync(m, 'utf8'),
    }));

  // Merge each page's id and default route with the meta declared in the page itself.
  const pageMetas = pageDetails.map(page => ({
    id: page.id,
    route: page.defaultRoute,
    ...getMeta(page),
  }));

  // Sort the posts using the latest helper and emit them as a generated module.
  const posts = pageMetas.map(extractMeta).sort(latest);
  return `module.exports = ${JSON.stringify(posts)};`;
};
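
For illustration only, the string returned by this function evaluates to a plain array of post metadata; the exact fields depend on the meta declared in each post, so the ones below are just examples:

// Illustrative result of the codegen module above (field names are assumptions).
module.exports = [
  {
    id: 'how-we-built-smapiot-com',
    route: '/blog/how-we-built-smapiot-com',
    title: 'How We Built smapiot.com',
    published: '2019-10-19',
  },
  // one entry per blog post, sorted via the latest helper
];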

We also use codegen modules to retrieve image references (or any kind of media) much more conveniently than with individual require() calls. This is how (part of) our shared media module looks:

export interface Files {
  [name: string]: string;
}

export interface Images {
  clients: Files;
  members: Files;
  plans: Files;
  recruiting: Files;
  services: Files;
  smapiot: Files;
  teasers: Files;
  technologies: Files;
}

// The media codegen module emits an object with the URLs of all referenced images.
export const images: Images = require('../generators/media.codegen');

As a result, any module can just import { images } from './media' and then, when needed, include an images.technologies.react reference. The rest is resolved as expected.
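
For example, a (hypothetical) component showing the React logo could look like this:

import * as React from 'react';
import { images } from './media';

// Illustrative component; the actual URL is resolved by the media codegen module.
export const ReactLogo = () => <img src={images.technologies.react} alt="React" />;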

Previously, we introduced the concept of individual design elements called components. These components translate nicely into React components. For instance, for our carousel we tried to come up with a solution that works well on mobile devices and desktops, is fast, and does not contain unnecessary code. We ended up shipping our own version; more implementation details are described on LogRocket's blog.

All our React components are functional components that make extensive use of hooks when state is required. In such cases we distinguish between view components (which do not come with state or logic) and logic components (which do not come with a view). In general, such an architecture makes sense, especially when thinking about reusability.

As far as the carousel is concerned, the actual carousel component is headless (i.e., it has no view) and only manages which slide is shown and how to place all slides, including offsets, for the desired view.

Carousel component
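
The following heavily simplified sketch shows the split between logic and view; our production carousel (see the LogRocket article) additionally handles gestures, offsets, and animations, which are all omitted here:

import * as React from 'react';

// Logic component: a headless hook that only manages which slide is active.
// Simplified illustration, not our production carousel.
function useCarousel(slideCount: number) {
  const [current, setCurrent] = React.useState(0);
  const next = () => setCurrent(c => (c + 1) % slideCount);
  const previous = () => setCurrent(c => (c - 1 + slideCount) % slideCount);
  return { current, next, previous };
}

// View component: renders the active slide without owning any logic.
export const Carousel: React.FC<{ slides: Array<React.ReactNode> }> = ({ slides }) => {
  const { current, next, previous } = useCarousel(slides.length);
  return (
    <div>
      <button onClick={previous}>‹</button>
      {slides[current]}
      <button onClick={next}>›</button>
    </div>
  );
};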

In most parts of our code, our web app is the standard React app that you would expect. For instance, our app definition looks as follows:

const app = (
  <BrowserRouter>
    <Route component={ScrollToTop} />
    <Route component={AppInsightsTracker} />
    <Routes pages={getPages()} />
  </BrowserRouter>
);

The Routes component takes the pages that have been lazily loaded and bound via the codegen module described earlier. The other two route components trigger on every page change: ScrollToTop simply scrolls back to the top (unless navigating backward), and the AppInsightsTracker gives us telemetry information for improving our page.
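
For reference, ScrollToTop follows the well-known React Router pattern of a component without any view; the sketch below is simplified and reduces the "unless backward" rule to a check of the navigation action:

import * as React from 'react';
import { RouteComponentProps } from 'react-router-dom';

// Rendered via <Route component={ScrollToTop} />, so React Router passes in
// the current location and history. Simplified sketch, not our exact code.
export const ScrollToTop: React.FC<RouteComponentProps> = ({ location, history }) => {
  React.useEffect(() => {
    // Skip scrolling when the user navigates back or forward in the history.
    if (history.action !== 'POP') {
      window.scrollTo(0, 0);
    }
  }, [location.pathname]);

  return null;
};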

On top of the actual coding, we also required some infrastructure changes to fully support the SPA model. Most importantly, we configured the .htaccess file for our Apache web server to use rewriting.

This is the essence of our configuration:

# Force HTTPS for all plain HTTP requests
RewriteEngine On
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# Serve index.html for every path that is not an existing file, directory, or symlink
RewriteBase /
RewriteRule ^index\.html$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
RewriteRule . /index.html [L]

ErrorDocument 404 /

In essence, we always want to use HTTPS. And if a URL with a subpath is requested directly, the request is rewritten internally to the index.html file located in the root directory. This file then serves our SPA, which routes to the content related to the subpath.

Conclusion

Rebuilding our web app has been a wonderful journey. We started with a simple concept, which then turned into a design exercise and a content discussion. Implementing different PoCs and evaluating them against our metrics helped us make the right choices. In the end, we are satisfied with the result.

Lothar Schöttner
Founder and CEO