Securing your website comes down to just two essential tasks.
1 Don't have security vulnerabilities.
Your site can only be hacked if it has known or identifiable security flaws. Very few developers knowingly write insecure code, and very few site owners install plugins they know will get them hacked. So how do security holes show up on our websites?
Security is a cat and mouse game. Millions of software developers are creating new code every single day. Some of that code has security holes, and some of it gets out on production servers. Eventually, someone discovers a flaw. The race begins between your site being randomly targeted and a patch being developed and installed on your server.
With over 350 million active websites, no one could find every vulnerable site by hand, and most sites are far too small ever to be targeted individually. That is why hackers write programs to do it for them. A single server can scan 350 million sites in about two months, and spinning up 60 servers in the cloud is so simple that any hacker who wants to could search the entire Internet and compromise every vulnerable system in a single day.
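The scan-rate arithmetic above is easy to sanity-check. This sketch uses the article's own figures (350 million sites, roughly two months for one server) to show how fleet size shrinks the scan time:

```python
# Back-of-the-envelope scan-rate check, using the article's figures.
TOTAL_SITES = 350_000_000
DAYS_PER_SERVER = 60  # "about two months" for a single server

SITES_PER_SERVER_PER_DAY = TOTAL_SITES / DAYS_PER_SERVER

def days_to_scan(servers: int) -> float:
    """Days needed to scan every site with a fleet of the given size."""
    return TOTAL_SITES / (SITES_PER_SERVER_PER_DAY * servers)

print(days_to_scan(1))   # 60.0 -> one server, about two months
print(days_to_scan(60))  # 1.0  -> sixty servers, a single day
```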
To exploit a known vulnerability on your website, a hacker only has to win this race once. To keep your site secure, you must win this race every time. Eventually, even websites that auto-patch will lose this race as hackers exploit unknown and unreported vulnerabilities.
2 Identify hacking attempts and block them.
Website security software tries to identify hacking attempts. This is difficult because many attacks, such as uploading malicious files, use legitimate web features, just invoked by unauthorized users. It can be almost impossible for firewalls to tell the difference. So website firewalls create signatures for each newly identified security flaw. This puts us in the same race against time as patching.
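A signature check like the one described above is essentially pattern matching against each incoming request. Here is a minimal sketch; the patterns are simplified illustrations, while a real firewall ships thousands of signatures and must update them every time a new flaw is disclosed:

```python
import re

# Hypothetical signatures for a few classic attack probes.
# Real firewall rule sets are far larger and constantly updated.
SIGNATURES = [
    re.compile(r"\.\./"),                 # path traversal attempt
    re.compile(r"<script", re.I),         # reflected XSS probe
    re.compile(r"union\s+select", re.I),  # SQL injection probe
]

def looks_malicious(request: str) -> bool:
    """Return True if the request matches any known-attack signature."""
    return any(sig.search(request) for sig in SIGNATURES)

print(looks_malicious("/blog?id=1 UNION SELECT password"))  # True
print(looks_malicious("/blog?id=1"))                        # False
```

The weakness is exactly what the article describes: a request exploiting an unknown flaw matches no signature, so it sails through.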
It is almost impossible to identify all possible "bad" web traffic. This is why so many websites with security software installed continue to suffer from devastating site hacks and redirect attacks.
99% of all web hacks use automated tools targeting random sites on the internet. Correctly identifying automated requests can stop nearly every website hack.
There are two types of traffic we must allow: web browsers (visitors using Chrome, Firefox, Safari, etc.) and web crawlers like Googlebot, Bingbot, SEO-Moz, and other helpful robots. Each request includes a User-Agent header that identifies the client. This User-Agent can easily be set to anything, and it is never set to "evil-hacking robot."
These two checks allow all of our mobile and desktop browsers and good robots to access the site. By allowing only verified clients, we keep automated hacking tools out.