Sulby WordPress theme

December 15, 2017

#CSS #HTML5 #JavaScript #jQuery #LEMP #MySQL #NGINX #PHP #Redux Framework #Sulby Theme #thenakedbrunch.com #WordPress

the naked brunch
thenakedbrunch.com

Sulby is an in-house standalone WordPress theme developed by bunhill.com. Designed and built from scratch to be lightweight, fast, secure and efficient, the theme is well suited to typical small business and presentation requirements.

The theme works effectively on both phones and larger devices and can be quickly tailored and styled to a client specification. Written in PHP, jQuery, HTML5 and CSS, the theme includes all of the required functionality itself, rather than depending on third-party plugins which may be insecure, poorly documented or poorly maintained over time. Knowing exactly what the theme contains makes it much quicker to meet specific client requirements effectively. The theme is deliberately stripped back and bare-bones, with none of the superfluous bloat and legacy code often found in commercially available themes. The aim has been to include only what is actually required – to create a flexible framework into which additional specific functionality can then be added as needed.

Sulby WordPress theme

This site has been built using the theme and there is a more comprehensive demo at thenakedbrunch.com. The Naked Brunch is an imaginary restaurant/café business with three branches. The site runs on a LEMP (Linux, Nginx, MySQL, PHP) stack and is hosted by bunhill.com at DigitalOcean. HTTPS (SSL/TLS) certificates are via Let’s Encrypt. In most cases this combination is also likely to be the best client hosting solution, since the setup has been carefully optimized and tested. A site based on this theme and code base could, however, also easily be implemented on existing or alternative hosting elsewhere.

The theme incorporates the Redux Framework, which enables us to quickly customize typography – Google Fonts are available as standard, so the look and feel of a site can be changed quickly (fig 1). Additionally, the backend administration dashboard can be customized so that, for example, functionality is simplified for clients to include only what they actually need (fig 2). At thenakedbrunch.com, for example, the client-user needs to be able to edit the restaurant menus and upload new PDFs (fig 3). In this example the ‘Restaurant Menu’ entity is a Custom Post Type defined in functions.php.

typography options
fig 1
client dashboard
fig 2
client dashboard
fig 3
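
As a rough illustration of that last point, a Custom Post Type like the ‘Restaurant Menu’ above can be registered from functions.php in a few lines. The slug, labels and supported features below are assumptions for the sake of the example, not the actual theme code:

// Hypothetical sketch of a 'Restaurant Menu' Custom Post Type in functions.php.
// The slug, labels and supported features are illustrative, not the real theme code.
function sulby_register_restaurant_menu() {
    register_post_type( 'restaurant_menu', array(
        'labels' => array(
            'name'          => __( 'Restaurant Menus', 'sulby' ),
            'singular_name' => __( 'Restaurant Menu', 'sulby' ),
        ),
        'public'      => true,
        'menu_icon'   => 'dashicons-media-document',
        'supports'    => array( 'title', 'editor', 'thumbnail' ),
        'has_archive' => true,
        'rewrite'     => array( 'slug' => 'menus' ),
    ) );
}
add_action( 'init', 'sulby_register_restaurant_menu' );

Registering the type on the init hook is the standard WordPress approach; the PDF menus themselves would simply be attached to each post via the Media Library.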

NGINX configured with FastCGI caching

December 12, 2017

#Caching #FastCGI Caching #loader.io #NGINX

“Nginx includes a FastCGI module which has directives for caching dynamic content that are served from the PHP backend. Setting this up removes the need for additional page caching solutions like reverse proxies (think Varnish) or application specific plugins. Content can also be excluded from caching based on the request method, URL, cookies, or any other server variable.”

How to Setup FastCGI Caching with Nginx on your VPS – DigitalOcean

So it turned out that there is a little more involved when multiple domains are hosted on the same server – in particular, the fastcgi_cache_key directive must not be duplicated across multiple Virtual Hosts. The Virtual Host files are read in alphabetical order, so the directive can either be included in the first one or else moved up into nginx.conf.
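
As a minimal sketch of that layout (the zone name, paths and timings below are illustrative assumptions, not the production configuration), the shared cache definitions sit once at the http level and each virtual host then simply switches the cache on:

# In nginx.conf (http block) – defined once for all sites
fastcgi_cache_path /var/run/nginx-cache levels=1:2 keys_zone=SHARED:100m inactive=60m;
fastcgi_cache_key "$scheme$request_method$host$request_uri";

# In each virtual host – just reference the shared zone
location ~ \.php$ {
    include fastcgi_params;
    fastcgi_pass unix:/run/php/php7.0-fpm.sock;   # assumed PHP-FPM socket path
    fastcgi_cache SHARED;
    fastcgi_cache_valid 200 301 302 60m;
    fastcgi_cache_use_stale error timeout updating;
}

Because $host is part of the cache key, the multiple domains sharing the one zone do not collide with each other.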

It seems to work very well. Here is thenakedbrunch.com, hosted on a $5 per month droplet @ DigitalOcean (referral link), handling 10,000 hits over 1 minute, tested using loader.io. Granted, this is not a definitive test of how the server would respond under real-world conditions, but it’s a great start: no errors, no timeouts. View the report in more detail here at loader.io.

loader.io
10,000 hits over 1 minute

Curation vs Search & Less Is More

December 2, 2017

#metadata #stock photography #stocksy #thenakedbrunch.com

Magnum Photos, Old Street, London, August 1991

In the summer of 1991, while researching my degree dissertation, I interned at Magnum Photos’ London bureau. That was the first time I consciously thought about the importance of metadata – captions, keywords, categories etc – the who, why, when, where, what and how. There was a fairly rudimentary catalogue structure in that pre-digital, pre-search era. The collection comprised drawers of transparencies and prints in labelled boxes, each image individually captioned. Often the same images would be duplicated under different labels. For example, there was a photo of Fidel Castro with Ernest Hemingway in the box labelled ‘Cuba’; the same image was also in another box labelled ‘Beards’. But it would have been impossible to have a box for every potential keyword, and finding the right image for a client was really about the talented team of Picture Researchers knowing the relatively small, carefully curated collection inside out.

Royalty Free stock is at the other end of that business and is typically about simple visual metaphors – the business handshake, the team and so on. Crowdsourced RF stock photography took off in the 00s, when DSLR cameras became affordable and broadband became ubiquitous. There was a huge demand for cheap stock images. Some of my own simple images made tens of thousands of dollars each. The new online-only agencies were quickly able to undercut the established giants (Getty Images ended up buying iStockphoto in 2006). Shooting new, contemporary-looking digital images was cheaper than laboriously scanning and captioning dusty old film. Licensing images for relatively little money made sense when the photographers could make it up on volume. But within a very few years there was a significant oversupply of images. The agencies could still shift volume, but photographers found it increasingly difficult to compete. Oversupply drove prices ever lower, and many of the agencies began to switch to subscription-based models in an attempt to lock in revenues, especially as growth began to slow.

Crowdsourced stock at the big sites is often more or less a free-for-all today. There is relatively little curation and the photographers typically supply their own, often very spammy, metadata. It can now be difficult to find the right stock image at the big sites: metadata is useless when so many images carry the same keywords. A search for “brunch” at Shutterstock, for example, currently returns over 200,000 photos, as I found when I was looking for a few images for the front page of The Naked Brunch – and many, even at the front of the search results, are completely irrelevant. Keyword search seems increasingly less useful when content is not also curated. No algorithm has fixed that so far, and search results often seem to be self-perpetuating since most users will not search more than a few pages deep. In the 00s agencies would boast about how many images they had – today, less really is more.

I was quickly able to find the right images at Stocksy, a co-operative based in Canada which offers a refreshing alternative. A limited number of photographers supply carefully curated content, and that typically means much better search results. From a buyer’s perspective it means not having to sift through thousands of irrelevant images. These are released images for business use – it is not an editorial agency – but there is an authenticity and style about the collection. It’s a return to quality over quantity.

The Naked Brunch
thenakedbrunch.com

Google Analytics Referrer Spam

November 29, 2017

#Bot Traffic #Ghost Spam #Google Analytics #Referrer Spam

“The technique involves making repeated web site requests using a fake referrer URL to the site the spammer wishes to advertise. Sites that publish their access logs, including referrer statistics, will then inadvertently link back to the spammer’s site. These links will be indexed by search engines as they crawl the access logs, improving the spammer’s search engine ranking. Except for polluting their statistics, the technique does not harm the affected sites. At least since 2014, a new variation of this form of spam occurs on Google Analytics. Spammers send fake visits to Google Analytics, often without ever accessing the affected site. The technique is used to have the spammers’ URLs appear in the site statistics, inducing the site owner to visit the spam URLs. When the spammer never visited the affected site, the fake visits are also called Ghost Spam.”

Referrer Spam – Wikipedia

One of our clients is asking us about strange results in the Google Analytics report we send her every week. There are lots of referrals which seem to make no sense. Can we sort it out for her?

It’s a low-traffic site, and it turns out that something like 50% of apparent visits are actually ghost spam: fake traffic which is visible only in Google Analytics and which is created not by actually visiting the site but by sending data directly to the account via the Measurement Protocol. Because these are not real site visits, there is no point in creating a server rule to block this traffic (e.g. in .htaccess). If we quickly create a report including all referrals and excluding all valid hostnames, we end up seeing this kind of thing:

Ghost Spam
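
To illustrate the mechanism (the property ID, client ID, hostname and referrer below are all invented), a ghost hit is nothing more than a request sent straight to Google’s collection endpoint via the Measurement Protocol – our own server never sees it, which is why no server-side rule can help:

// Illustrative sketch only – the property ID, client ID, hostname and referrer are made up.
$payload = http_build_query( array(
    'v'   => 1,                              // Measurement Protocol version
    'tid' => 'UA-00000000-1',                // the targeted (often simply guessed) property ID
    'cid' => '555',                          // arbitrary client ID
    't'   => 'pageview',                     // hit type
    'dh'  => 'spam-domain.example',          // hostname – not ours, which the filter below exploits
    'dp'  => '/',                            // document path
    'dr'  => 'http://spam-domain.example/',  // the fake referrer being advertised
) );

// The request goes to Google's collection endpoint, not to the target website, so nothing
// in .htaccess, nginx or a firewall ever gets the chance to block it.
file_get_contents( 'https://www.google-analytics.com/collect?' . $payload );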

These are fake referrer URLs, typically designed to encourage us to visit those sites. Sometimes the domains have been created to look like well-known internet sites by using homographs – similar-looking Unicode characters. Other times the domain will be real. Sometimes messages are included in the data. Motherboard has an article about this issue, relating to their own domain being used in referrer spam: We’re not spamming your Google Analytics – Motherboard.

On sites where traffic matters this can be an issue, as it negatively affects the statistics which clients and advertisers care about (though not, typically, rankings). The obvious discrepancy here, which we can see from the report, is that the hostnames are not ours – our website is not serving these apparent hits. So that’s easy enough to filter from the client report. We start by creating a duplicate view and then create a filter to include only validated hostnames, expressed as a simple regex – e.g. www\.example\.com|example\.com

Valid Hosts Only
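
As a quick sanity check of what that include filter keeps and drops (the hostnames here are examples only), the same expression can be tested locally:

// Example hostnames only – an anchored version of the include-filter regex.
$valid = '/^(www\.example\.com|example\.com)$/';

foreach ( array( 'www.example.com', 'example.com', 'spam-domain.example', '(not set)' ) as $hostname ) {
    // Only hits whose hostname matches the regex survive the filtered view.
    echo $hostname . ' => ' . ( preg_match( $valid, $hostname ) ? 'kept' : 'dropped' ) . "\n";
}

Anchoring the pattern is worth considering in the real filter too, so that a spam hostname which merely contains example.com does not slip through.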

While we are at it, we can go into View Settings and make sure that “Exclude all hits from known bots and spiders” is checked – and have a look at Campaign Source.
