For the last forty years, television has evolved fairly slowly. That might raise eyebrows in this age of on-demand content, but it’s important not to confuse streaming and binge watching with the technical process of creating television programming. Now, just as forty years ago, cameras capture two-dimensional pictures that are projected onto a rectangular screen in our living rooms. Aspect ratios have improved and picture quality is far superior, but the passive nature of our viewing experience remains much as it was in the 1970s.

Imagine a world where civic life is transformed by a perfect storm of converging technologies – most notably ubiquitous high-speed cloud services and smartphones – a slew of Internet-connected sensors, controllers and other devices (the Internet of Things), plus intelligent tools for aggregating and analyzing the huge volumes of data that these devices will generate, often in real time. This is the idea behind smart cities.

The cloud is often portrayed as a perfect solution to IT concerns. From affordability to accessibility, its merits are championed by many. However, the cloud isn’t a nebulous theoretical concept: it involves expensive hardware and requires ongoing maintenance and supervision. There’s no doubt that cloud services can represent a simple and reassuringly robust alternative to in-house platforms, but sometimes there’s a fiscal sting in the tail in the form of bandwidth costs.

Welcome to part three of our series on securing a web directory in Apache using the .htaccess file.

In part one we looked at what the .htaccess file is and how it works. In part two we looked at the auth directive and how it can be used to add authentication-based access restrictions to a website directory. If you haven’t already, I’d recommend reading the previous parts before continuing with this one.
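As a quick refresher, the authentication approach covered in part two looks roughly like this in a .htaccess file (the realm name and password-file path here are placeholders, not values from the earlier articles):

```apache
# Prompt for a username and password for this directory
AuthType Basic
AuthName "Restricted Area"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

The .htpasswd file referenced by AuthUserFile is typically created with Apache’s htpasswd utility and should live outside the web root.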

In this article we’ll look at using the .htaccess file to either block or allow access based on where a request comes from.
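As a preview of the sort of rule this article covers, host-based restrictions in Apache 2.4 are built from Require directives. A minimal sketch, assuming Apache 2.4 syntax (the IP address below is a documentation placeholder):

```apache
# Block one troublesome address while allowing everyone else
# (both conditions inside <RequireAll> must be satisfied)
<RequireAll>
    Require all granted
    Require not ip 203.0.113.50
</RequireAll>
```

Older Apache 2.2 installations use the Order, Allow and Deny directives instead, so it’s worth checking which version your server runs before copying rules between guides.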

Things were better in the old days, the purists always say. Gas lamps were better than sodium lighting, vinyl gave a purer sound than CDs, and computer games were more enjoyable when gameplay took precedence over graphics. The latter argument certainly has some merit – some of today’s PS4 and Xbox One titles do seem more focused on providing pixel-perfect backdrops than linear difficulty curves. But is it true that modern gamers are missing out on the sheer playable pleasures of old school titles?

YouTube may be a great way to watch new music videos or pass an idle ten minutes at work, but it also provides some surprisingly valuable marketing lessons.

This post is part of our Getting Started With Linux series, in which we address everything from distributions to kernel panics. To read more from this series, click here.

One of the more important tasks when managing a dedicated or virtual private server is being aware of what it is doing and how it is performing. A poorly performing or overloaded server isn’t just a problem for you: it can mean frustration for your users and, potentially, lost revenue from any affected ecommerce sites. In previous articles I’ve covered how you can use top and htop to monitor your system processes, CPU usage and RAM usage. These tools don’t provide much help when you have issues with network throughput, though, which is where iftop comes in handy.
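To give a flavour of what’s ahead, here is a minimal sketch of installing and running iftop. The package name and interface name are assumptions that vary by distribution and machine; iftop generally needs root privileges to capture traffic:

```shell
# Install iftop (Debian/Ubuntu; package name may differ on other distros)
sudo apt-get install iftop

# Monitor the default interface; -n skips DNS lookups, -P shows port numbers
sudo iftop -nP

# Watch a specific interface (eth0 here is an assumption)
sudo iftop -i eth0
```

The display shows a live, per-connection breakdown of bandwidth in and out, which makes it much easier to spot which host or service is saturating your link.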

At 100TB, we are constantly scouting out the next best place to provide lightning-fast services at all hours of the day and night. We know how important location is to our clients, and we are excited to announce our second data center in Dallas, TX.