Information security, protection and secure administration of web resources

This overview covers web services you can use to test sites for vulnerabilities. HP estimates that 80% of all vulnerabilities are caused by incorrect web server settings, outdated software, or other issues that could easily have been avoided.

The services in this overview help identify such problems. Typically, scanners check targets against a database of known vulnerabilities. Some are quite simple and only check for open ports, while others are more thorough and even attempt SQL injection.

WebSAINT

SAINT is a well-known vulnerability scanner on which the WebSAINT and WebSAINT Pro web services are built. As an Approved Scanning Vendor, the company performs ASV scans of websites for organizations that require them under PCI DSS certification. The service can run on a schedule, perform periodic checks, and generate various reports on scan results. WebSAINT scans TCP and UDP ports at specified addresses on the user's network. The "professional" version adds penetration tests, web application scanning, and customizable reports.

ImmuniWeb

The ImmuniWeb service from High-Tech Bridge takes a slightly different approach: in addition to automated scanning, it also offers manual penetration tests. The procedure starts at the time specified by the client and takes up to 12 hours. Company employees review the report before it is sent to the client. It lists at least three ways to address each identified vulnerability, such as modifying the web application's source code, changing firewall rules, or installing a patch.

Human labor costs more than automated checks: a full scan with ImmuniWeb penetration tests costs $639.

BeyondSaaS

BeyondSaaS from BeyondTrust costs even more. Customers can subscribe for $3,500 and then run an unlimited number of checks for a year; a single scan costs $700. Sites are checked for SQL injection, XSS, CSRF, and operating system vulnerabilities. The developers claim a false-positive rate of no more than 1%, and the reports also suggest options for fixing the problems found.

BeyondTrust offers other vulnerability scanning tools, including the free Retina Network Community, which is limited to 256 IP addresses.

Dell SecureWorks

Dell SecureWorks is arguably the most advanced of the scanners in this review. It is powered by QualysGuard Vulnerability Management technology and checks web servers, network devices, application servers and DBMSs both within the corporate network and on cloud hosting. The web service is PCI, HIPAA, GLBA and NERC CIP compliant.

There are many possible attacks on a web resource, and just as many possible consequences. As always, there are only two motives: fame, with the simple joy of showing off one's abilities, and profit, direct or indirect material gain; in other words, money. So what is the threat? Here are the most common attacks on websites:

  • Homepage defacement is one of the most common forms of hacking. Instead of the usual content, the site's front page shows anything from the hacker's name to plain insults.
  • Deletion of the file system: all information simply disappears, which is a disaster if there is no saved copy of the resource. Note that the database of client passwords and other critical data may also be lost.
  • Substitution of information: attackers can replace the organization's phone number or other details. In that case, your clients automatically become the attackers' clients.
  • Planting trojans: in this case you will most likely not notice the hacker's visit; at least, everything will be aimed at that. Malicious programs can perform a variety of functions, such as redirecting users to a malicious website, stealing customers' personal data, infecting visitors with viruses, and so on.
  • Sending spam: your site can be used to send spam, in which case your "real" correspondence will not reach its addressees, since your organization's domain will almost immediately be entered into centralized spammer blacklists.
  • Creating a high load: sending deliberately malformed requests to the web server, or other external actions that make the site hard to reach or crash the server's operating system. This type of attack is very widespread on the Internet.

The result of all these attacks is not only a temporary outage of the resource, but also a loss of customer trust in the website. A user who picked up malicious code on your resource, or was redirected from your site to a site of dubious content, is unlikely ever to dare type your address into the browser again.

What to do?

Website security should be considered already at the development stage. There are many CMSs (Content Management Systems) that provide a template simplifying site management and development. The whole range of CMSs can be divided into open (free) and proprietary. Notable open systems include Drupal, Mambo, Joomla and Typo3; paid ones include 1C-Bitrix, NetCat and Amiro.CMS. All of them are more or less secure and have their advantages and disadvantages. So which CMS should you choose? Of course, this question must be weighed case by case; however, statistics show that in Russia the vast majority of web studios building sites on third-party platforms use the 1C-Bitrix product. There are several reasons for this:

  • By merging with 1C, Bitrix has unofficially become the national standard for CMS-based web development.
  • 1C-Bitrix has a security certificate from Positive Technologies (discussed later), confirming the system's resistance to all known types of attacks on web applications.
  • 1C-Bitrix is currently the most promising CMS on the Russian market, showing the best growth rate.
  • The functionality of the product is enough to create complex corporate sites, information and reference portals, online stores, media sites, as well as to create almost any other type of web resources.

Creating sites based on 1C-Bitrix, as well as transferring existing resources to the product engine, is one of the options for solving a number of security problems, primarily vulnerability issues, which will be discussed later.

The site has already been created - is it vulnerable?

Checking an existing web resource for vulnerabilities is a very time-consuming task. The process is not limited to scanning itself: the site still needs to be reworked, holes plugged, and a number of issues resolved on the provider's side. So, vulnerability scanners.

Vulnerability scanners are special programs designed to analyze network security by scanning and probing network resources and identifying their vulnerabilities. Simply put, a scanner looks for typical security holes, making life easier not only for website owners but also for hackers. By method of operation, all vulnerability scanners fall into three groups:

  • Local - installed directly on the host being checked and provide high reliability. They run under an account with maximum privileges and use only one method of searching for vulnerabilities: comparing file attributes.
  • Passive - use network traffic as their data source but, unlike network scanners, minimize the scanner's impact on the systems being checked. They are not yet widespread, but they look very promising.
  • Network - the most popular today. They perform checks remotely, connecting through network services.
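At its core, a network scanner's remote check is simply an attempt to connect to services over the network. A minimal sketch of that idea, assuming Python (the function name and port list are illustrative):

```python
import socket

def check_open_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds (port open)
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    print(check_open_ports("127.0.0.1", [22, 80, 443]))
```

Real scanners such as Nmap add service fingerprinting, OS detection and much more on top of this basic probe.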

There are many manufacturers of vulnerability scanners, there are a lot of reviews and tests highlighting the product of a particular company. Here are some of the most common scanners: Nessus, XSpider, IBM Internet Scanner, Retina, Shadow Security Scanner, Acunetix, N-Stealth.

XSpider (since replaced by MaxPatrol) is a scanner from the Russian vendor Positive Technologies. It has a truly extensive feature list: heuristic analysis and server type detection, full port scanning and service mapping, checks for default passwords, SQL injection and XSS analysis, and almost daily vulnerability database updates. Compared with competitors, the scanner identifies services and applications better, which yields more accurate vulnerability detection with a minimal percentage of false alerts. The product is one of the best solutions not only on the Russian but also on the world stage, which is why we chose to highlight it.

What needs to be changed?

Securing a web resource is a process combining a certain set of actions. The existing system is first examined for security; then the measures and work needed to achieve that security are determined. This can include the services of programmers who develop or optimize the site, the services of engineers who solve technical issues, and, of course, a set of organizational measures. It all depends on the customer's wishes and capabilities.

Web site crawlers are special programs that detect vulnerabilities and defects in web applications.

They let you look at the site through a robot's eyes and understand how it is seen by the search engines that analyze the state of resources, so you can correct possible errors.

The functioning of web site crawlers can be compared with diagnostic devices of the human body, which can detect various diseases and pathologies in a timely manner in order to successfully deal with them, especially if they are in the early stages of development.

What data can be obtained using a web site crawler

When using web crawlers, the following can be detected:

  • Errors introduced during coding: incorrect handling of incoming and outgoing data (SQL injection, XSS).
  • Errors in deploying and configuring web applications: a misconfigured environment (application servers, SSL/TLS, and various third-party components).
  • Errors in operating the resource: use of outdated software, weak passwords, unprotected archived data, service modules left directly accessible on the server.
  • Incorrect IP filter configuration resulting in denial of service, including attacks that flood the server with automated requests it cannot process quickly, so it "hangs" and stops serving real users.
  • Poor OS security on the server, which gives fraudsters the opportunity to execute arbitrary code.

Principles of operation of web site crawlers

  1. Collection of information about the analyzed resource.
  2. Checking the resource's software for vulnerabilities.
  3. Identification of weak sectors in the system.
  4. Making recommendations for removing errors.
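The last three steps above boil down to comparing what was found against a knowledge base and reporting. A toy sketch, assuming Python; the database contents, version numbers and function names are all illustrative:

```python
# Hypothetical known-vulnerability database: product -> earliest safe version
VULN_DB = {
    "nginx": (1, 18, 0),
    "php": (7, 4, 0),
}

def parse_version(text):
    """Turn '1.14.2' into a comparable tuple (1, 14, 2)."""
    return tuple(int(part) for part in text.split("."))

def scan(detected_software):
    """Compare detected versions against the database and
    produce recommendations for the weak spots found."""
    findings = []
    for name, version in detected_software.items():
        safe = VULN_DB.get(name)
        if safe and parse_version(version) < safe:
            findings.append(
                f"{name} {version} is outdated - upgrade to "
                + ".".join(map(str, safe)) + " or later"
            )
    return findings

print(scan({"nginx": "1.14.2", "php": "8.1.0"}))
```

Real scanners use large, constantly updated vulnerability feeds instead of a hard-coded dictionary, but the compare-and-recommend loop is the same.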

Varieties of website security scanners

Based on the purpose, these programs are divided into several groups:

  • Network scanners. Discover all kinds of network services, determine the OS, etc.
  • Scanners for detecting errors in scripts. Recognize vulnerabilities such as XSS, SQL injection, LFI/RFI, etc., as well as bugs left behind by outdated components, directory indexing, and so on.
  • Exploit selection tools. They search for applicable exploits for software and scripts.
  • Injection automation programs. Utilities that detect and exploit injections.
  • Debuggers. Programs for eliminating defects and editing code in resources.

Additionally, there are general website vulnerability scanners that combine several of these categories at once.

Free Services

Network Services:

  • Nmap is an open-source program used to explore networks, regardless of the number of hosts, and determine their state.
  • IP Tools is a protocol analysis suite offering filtering rules, a filtering adapter, protocol decoding, etc.

Scanners for detecting errors in scripts:

  • Nikto - provides a comprehensive examination of servers for errors, checks a large number of potentially unwanted files and applications.
  • Skipfish - provides recursive verification of applications and subsequent analysis based on special dictionaries.

Exploit recognition programs:

  • Metasploit is a Ruby-based framework (originally written in Perl) that provides comprehensive testing of a variety of platforms and applications.
  • Nessus - in the process of analysis, it uses both standard testing methods and separate ones that imitate the behavior of fraudsters in the process of introducing into the system.

Injection automation programs:

  • SQLMap is an open-source tool used to analyze SQL injection vulnerabilities.
  • bsqlbf-v2 is a program for finding blind SQL injections.

Debuggers:

  • Burp Suite is a set of stand-alone services developed on the basis of Java.

Purpose of a site virus scanner

Owners and administrators of resources often encounter a security breach, after which the site is banned by search engines or blocked by antivirus software. Warning letters arrive from the hosting provider, and users complain about third-party ad windows or redirects to other resources.

There is a need to identify the causes of these problems. This is a special procedure performed by the site's virus scanners. It includes 2 main steps:

  1. Analysis of hosted files and databases to detect harmful scripts and injections.
  2. Checking the resource with a page scanner to identify hidden redirects and other problems that vulnerability-search programs cannot find.

The site's virus scanners perform static and dynamic analysis to find malicious elements. Static analysis identifies harmful components, links, spam and other suspicious elements on the analyzed pages. Such elements are detected with the help of a signature database or a constantly updated data list. If a malicious element is present in the page code and is known to the scanner through its database, the program will flag it.

Unlike static analysis, dynamic analysis studies the website's documents by simulating user actions. The results show what happens in the end and how the site responds to requests.
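The static, signature-based part of this process can be sketched in a few lines; the signature strings and function name below are illustrative, not taken from any real scanner:

```python
# Illustrative signatures: byte patterns often seen in injected web-shell code
SIGNATURES = [b"eval(base64_decode(", b"document.write(unescape("]

def scan_file(path):
    """Static check: report which known signatures occur in the file."""
    with open(path, "rb") as fh:
        data = fh.read()
    return [sig.decode() for sig in SIGNATURES if sig in data]
```

Production scanners use far larger signature databases plus heuristics, but the core operation is exactly this substring match against known-bad fragments.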

The most popular website virus scanners include NOD32, Dr.Web, Kaspersky, etc. All of them are quite effective if you use versions with the latest updates. They work online.

Materials used in the preparation: "9 security tips to protect your website from hackers", "10 Tips to Improve Your Website Security" and "Web Application Security Testing Cheat Sheet"

Public web applications interest hackers as resources or money-making tools. The range of uses for information obtained through hacking is wide: paid access to the resource, use in botnets, etc. The owner's identity does not matter, since hacking is automated and done at scale. The value of the information is proportional to the company's fame and influence.

Given enough determination, a vulnerability will be found in any application. In its 2016 website hacking report, Google reported that the number of hacked resources grew by 32% compared with 2015, and this is not the limit. Keep this in mind, and discard any illusions about the unassailability of your web resources when planning information security work.

The tips described below will help you sort out and close the top-priority problems in the technical protection of the site.

Use security analysis tools

Before looking for vulnerabilities manually, check the application with automated tools. They perform penetration tests, trying to break in, for example via SQL injection.

Below is a selection of free tools.

Applications and frameworks

  • OpenVAS scans network nodes for vulnerabilities and allows you to manage vulnerabilities.
  • The OWASP Xenotix XSS Exploit Framework scans a resource for XSS exploits.
  • Approof by Positive Technologies checks web application configuration, scans for vulnerable components, exposed sensitive data, and malicious code.

Prevent SQL Injections
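The standard defense is parameterized queries: the database driver treats user input strictly as data, never as SQL. A minimal sketch using Python's built-in sqlite3 module (the table, data and function name are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (login TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

def find_user(login):
    # The ? placeholder makes the driver treat `login` strictly as data,
    # so input like "' OR '1'='1" cannot change the query structure.
    cur = conn.execute("SELECT email FROM users WHERE login = ?", (login,))
    return cur.fetchone()

print(find_user("alice"))            # ('alice@example.com',)
print(find_user("' OR '1'='1"))      # None - the injection attempt fails
```

Never build SQL by concatenating user input into the query string; every mainstream database driver and ORM offers placeholders like the one above.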

Check and encrypt passwords

Store passwords as hashes, preferably using one-way hashing algorithms such as SHA-2. Hashed values are then compared to authenticate users. If an attacker breaches the resource and obtains the hashed passwords, the damage is limited because hashing is one-way and recovering the original data from a hash is practically impossible. However, hashes of popular passwords are easily brute-forced, so also use a "salt" that is unique to each password. Cracking a large number of salted passwords is even slower and requires more computational effort.
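A sketch of salted one-way hashing using Python's standard library (PBKDF2 is used here as one reasonable slow-hash choice; the function names and iteration count are illustrative):

```python
import hashlib, hmac, os

def hash_password(password, salt=None):
    """Return (salt, digest) using a salted, deliberately slow one-way hash."""
    salt = salt or os.urandom(16)          # unique random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(hash_password(password, salt)[1], digest)

salt, digest = hash_password("correct horse")
print(verify_password("correct horse", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))    # False
```

Only the salt and digest are stored; the plaintext password never touches the database.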

As for validation, set a minimum password length, and also check that the password does not match the login, e-mail, or site address.
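Such a server-side password policy check might look like the following sketch (the length limit and function name are assumptions):

```python
def password_errors(password, login, email, site):
    """Return a list of policy violations (empty list means the password passes)."""
    errors = []
    if len(password) < 10:
        errors.append("too short")
    lowered = password.lower()
    # Reject passwords containing the login, e-mail, or site address
    for banned in (login, email, site):
        if banned and banned.lower() in lowered:
            errors.append(f"contains '{banned}'")
    return errors

print(password_errors("alice2024!", "alice", "a@example.com", "example.com"))
```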

Fortunately, most CMSs provide security policy management tools, but using a "salt" or enforcing minimum password complexity sometimes requires extra configuration or an additional module. With .NET, it is worth using membership providers, because they have a built-in security system with many settings and out-of-the-box elements for authentication and password changes.

Control the process of file uploads

Letting users upload files to a website, even if it is just changing an avatar, is a threat to information security. An uploaded file that looks harmless at first glance may contain a script that, when executed on the server, gives an attacker access to the site.

Even if there is a type restriction (for example, images only), be suspicious of user-uploaded files. The extension or MIME type is easy to fake, reading the header or using image-size checks gives no 100% guarantee, and PHP code can be embedded in most image formats and executed on the server.

To prevent this, keep users from executing uploaded files. By default, web servers do not try to execute files with image extensions, but do not rely on the extension alone: there have been cases where a file named image.jpg.php bypassed this check.

Access restriction methods:

  • rename or change file extensions on upload;
  • change permissions, for example to chmod 0666, which removes the execute bit;
  • create an .htaccess file (see example below) that will only allow access to the specified file types.
    order deny,allow
    deny from all
    <Files ~ "\.(gif|jpe?g|png)$">
        allow from all
    </Files>
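The rename-on-upload approach from the list above can be sketched in Python; the extension whitelist and function name are illustrative:

```python
import os
import uuid

SAFE_EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif"}

def stored_name(original_filename):
    """Give an uploaded file a random name and keep only a whitelisted
    extension, so names like 'image.jpg.php' cannot reach the server as code."""
    ext = os.path.splitext(original_filename)[1].lower()
    if ext not in SAFE_EXTENSIONS:
        ext = ""                      # drop anything not on the whitelist
    return uuid.uuid4().hex + ext

print(stored_name("avatar.PNG"))      # random hex name ending in .png
print(stored_name("image.jpg.php"))   # random hex name with no extension
```

Because the stored name is generated server-side, the attacker controls neither the file name nor its final extension.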

A safer way is to prevent direct access to uploaded files altogether by storing them, for example, outside the site's root folder. However, you will then need a script (or an HTTP handler in .NET) to fetch files from the private area and serve them to the user.
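The core of such a retrieval script is mapping a requested name to the private folder while rejecting path traversal. A sketch, assuming Python; the folder path and function name are hypothetical:

```python
import os

PRIVATE_DIR = "/srv/uploads"   # hypothetical folder outside the site root

def resolve_upload(requested_name):
    """Map a requested file name to a path inside PRIVATE_DIR,
    rejecting traversal tricks like '../../etc/passwd'."""
    candidate = os.path.realpath(os.path.join(PRIVATE_DIR, requested_name))
    # realpath collapses '..' segments, so a simple prefix check suffices
    if not candidate.startswith(os.path.realpath(PRIVATE_DIR) + os.sep):
        raise PermissionError("path escapes the upload directory")
    return candidate
```

The web framework then streams the resolved file to the user, so nothing in the private folder is ever served (or executed) directly by the web server.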

Web application protection measures for owners of their own servers:

  1. Set up a firewall, including blocking unused ports.
  2. If the server is accessed from the local network, create a demilitarized zone (DMZ), allowing only ports 80 and 443 to be reached from the outside world.
  3. If there is no access to the server from the local network, use secure methods (SFTP, SSH, etc.) to transfer files and manage the server from the outside.
  4. If possible, dedicate a separate database server that won't be directly accessible from the outside world.
  5. Limit physical access to the server.

Watch out for error messages

Be careful about what appears in application error messages. Report errors to the user in the most concise form, with no technical details. Store the detailed information in server log files. With complete information, it is easier for an attacker to mount complex attacks like SQL injection.
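For example, a request handler can log the full exception server-side while returning a neutral message to the user. A sketch, assuming Python; the handler name and log file are illustrative:

```python
import logging

logging.basicConfig(filename="app.log", level=logging.ERROR)

def handle_request(raw_id):
    try:
        user_id = int(raw_id)          # may raise ValueError on bad input
        return f"profile {user_id}"
    except Exception:
        # Full technical detail (traceback included) goes to the server log only
        logging.exception("failed to handle request: %r", raw_id)
        # The user sees a concise message with no internals
        return "Something went wrong. Please try again later."

print(handle_request("42"))
print(handle_request("42; DROP TABLE users"))
```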

To keep your finger on the project's pulse, install an error monitoring system, for example Sentry, which automatically collects errors from handlers in the application code and from a user-facing form, and provides a panel for managing them in real time.

Check incoming data

Validate data received from web forms both on the client side and on the server side. The browser catches simple errors such as an empty required field or text entered in a numeric field. These checks can be bypassed, so server-side validation is required. Without it, an attacker can exploit injections and other types of attacks.
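A server-side re-check of a form might look like this sketch (the field names and rules are assumptions):

```python
import re

def validate_order_form(form):
    """Re-check on the server what the browser may have skipped."""
    errors = {}
    if not form.get("name", "").strip():
        errors["name"] = "required field is empty"
    qty = form.get("quantity", "")
    # Accept only positive whole numbers in the numeric field
    if not re.fullmatch(r"\d+", qty) or int(qty) == 0:
        errors["quantity"] = "must be a positive whole number"
    return errors   # empty dict means the input is acceptable

print(validate_order_form({"name": "Alice", "quantity": "3"}))   # {}
print(validate_order_form({"name": "", "quantity": "-1"}))
```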

Assign file permissions

File permissions define who can do what with a file.

In *nix systems, files have three access options, represented as numbers:

  • "Read" (4) - reading the contents of the file;
  • "Write" (2) - change the contents of the file;
  • "Execute" (1) - execution of a program or script.

To set multiple permissions, just add their numerical values:

  • "Read" (4) + "write" (2) = 6;
  • "Read" (4) + "write" (2) + "execute" (1) = 7.

When assigning rights, users are divided into 3 types:

  • "Owner" (owner) - the creator of the file (changeable, but there can be only one);
  • "Group" (group) - a group of users who receive permissions;
  • "Others" (others) - other users.

Granting the owner read and write access, the group read access, and others no access gives: owner 4 + 2 = 6, group 4, others 0.

Final representation: 640.

For directories it is similar, but the "execute" flag means permission to enter the directory (make it the working directory).
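The 640 example above can be checked programmatically; a sketch, assuming Python on a *nix system:

```python
import os
import stat
import tempfile

# 640 = owner read+write (4+2), group read (4), others none (0)
fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, 0o640)

mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))   # 0o640
# The same value expressed through the named permission flags
assert mode == stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP
os.remove(path)
```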


A site is like a garden: the more work you invest, the more generous the fruit. But it also happens that a watered, fertilized and carefully groomed site suddenly drops out of the search results with a bang. What is this, the scheming of competitors? Usually the reason is much more banal: viruses have settled on your web resource.

So, let's look at where website viruses come from, what symptoms give them away, how to check your beloved brainchild for malware, and how to protect it from all these evil spirits.

Sources, signs and goals of viral infection of Internet resources

There are far fewer ways for viruses to get onto a website than onto, say, end-user devices. In fact, there are only three:

  • An infected computer from which files are uploaded to the site. This accounts for more than 90% of cases.
  • Hacking. It can be targeted, for example if business competitors "ordered" an attack or the resource somehow attracted intruders' attention, or random, simply because the site was poorly secured.
  • Vulnerabilities in the CMS, server software, plug-ins and other software the site interacts with.

How do viruses show their presence:

  • The number of visitors drops sharply and without reason. The web resource loses its positions or falls out of search engine results. When you try to open it in a browser, formidable warnings appear instead of its pages.
  • The page design changes spontaneously. Rogue advertising banners, blocks, links and content you did not post appear. If payments are made on the resource, the payment details may change.
  • The site's functionality breaks; links no longer lead where they should.
  • Visitors complain that antivirus software flags your site, or that signs of infection appeared on their devices after opening it.

What malicious activity do viruses carry out on Internet resources?

  • Theft of content, databases, traffic and money.
  • Infecting visitors' devices and other vulnerable sites on the same server.
  • Redirecting your visitors to resources the attackers need, for example by installing doorways with spam links or adding a malicious mobile redirect to .htaccess. Such code redirects only visitors who arrive from mobile devices.
  • Boosting someone else's search positions at your expense.
  • Sending spam and malicious messages from your mail, often so that your email ends up in spammer databases and your subscribers and users stop receiving your letters.
  • Fully or partially disabling the web resource, or deliberately getting it removed from search indexing (cloaking).
  • Installing web shells and backdoors on the server, giving the attacker remote access to the server's file system.

Methods for diagnosing site security

There are several ways to check a website for viruses. The fastest and easiest, but rather superficial option is to check with online antivirus scanners. You should always start with it when there is even the slightest suspicion of the presence of malware.

If an online site check revealed a threat, then it is advisable to conduct a full file-by-file scan using antivirus programs.

In addition, some webmasters practice manual virus scanning: opening and inspecting each suspicious file for implanted code. The search relies on signatures (code fragments often found in malicious objects) and on comparing potentially infected files with known-clean ones. With knowledge and experience, this method can be the most reliable, because even the most powerful and highly rated antiviruses miss threats.

Security issues on websites are often the first to be noticed by search engines:

  • Yandex.Webmaster displays information about them on the "Diagnostics" - "Security and violations" page.

  • Google Search Console- in the "Tools for Webmasters" - "Status" - "Malware" section.

If malware is detected, follow Yandex and Google's recommendations for finding and eliminating it. And then check the site with online crawlers.

Online scanners to check websites for viruses and hacking

i2p

i2p is a simple free Russian-language service for quickly checking web resources - in whole or individual pages - for malicious content. The analysis takes a few seconds, but the result, alas, is not always reliable. "Suspicions of a virus", as in the example below, can be quite harmless. They just need more attention.

VirusTotal is one of the most famous and popular online antivirus scanners. It scans Internet resources (as well as any files) with 65 antivirus engines, including Kaspersky, Dr.Web, ESET, Avast, BitDefender, Avira, etc., and displays the reputation of the checked site according to VirusTotal community votes. The service interface is English-only.

To scan a web resource on VirusTotal, open the URL tab on the main page, paste the link into the "Search or scan URL" field and click on the magnifying glass icon.

The service does not just report whether the website is clean or infected: it displays a list of checked files with notes on what aroused suspicion. The analysis is based on the service's own and global antivirus databases.

Other sections of the service offer articles about diagnostics, manual virus removal, protection against infection, backups and other materials on Internet resource security.

The Dr.Web laboratory analyzes the state of websites using only its own databases and algorithms.

Based on the scan results, a report is generated:

  • Whether malware was detected on the object.
  • Whether it appears in any databases of malicious objects.
  • Whether it redirects visitors to other resources.

The results of the file scan and additional information about suspicious facts are displayed below.

xseo

The unsightly Xseo web service is actually more informative and functional than many. It checks sites for more than six million known viruses, for phishing, and also displays their security ratings according to MyWOT, Yandex and Google. In addition, Xseo contains a ton of other useful and free SEO tools. Access to some of them requires registration.

- another free service for security checks of Internet resources. It can detect signs of infection with known malware, find errors on websites, check them against blacklist databases, and determine whether the CMS version is up to date. The service interface is in English, Spanish and Portuguese.

— a tool for a comprehensive check of Internet resources for infection and hacking. Detects the following types of threats:

  • Obfuscated (encrypted) scripts.
  • Hidden redirects.
  • Spyware bookmarks, inserts and widgets from suspicious sites.
  • Drive-by attacks (malware downloads without the user's knowledge).
  • Spam links and content.
  • Errors and signs of defacement.
  • Presence on search engine and antivirus blacklists.

After a free on-the-spot scan, the service offers site disinfection and protection services from its specialists, this time for a fee.

Checks the reputation of links: whether the resource appears on the list of infected or phishing sites in the Kaspersky Security Network databases.

The scanner searches for malware both in databases and on the basis of heuristic analysis, due to which it sometimes detects threats that antiviruses do not yet know about. In addition to scanning, the service offers paid services for cleaning websites from viruses and subsequent infection prevention.

Quttera interface in English.

The Russian-language service checks websites using 20 different antiviruses. In addition to this, it offers paid services for cleaning from found malware and installing permanent protection tools.

Checking the site with an antivirus on a computer

The next step in checking a web resource is scanning all of its files with an antivirus program installed on a PC. Any comprehensive antivirus product with fresh databases will do for this task; use the one you trust most.

Before scanning, download the site's contents to a separate folder on your PC or to removable media; then, without opening anything in the folder, run the scan. Do not click on the files, or the malware may infect your computer.

If threats are found, the best and fastest fix is to replace the infected files with clean ones from the latest backups. If there are no copies, you can delete the dangerous objects manually, but be sure to make a backup first.

What could be potentially dangerous:

  • Embedded frames and scripts (look for iframe and javascript).
  • Scripts loaded from external sources.
  • Redirects to third-party resources (even normal, uninfected ones).
  • Externally loaded pictures and other multimedia objects.
  • Other external additions.
  • Files whose modification date is close to the suspected date of infection.

Of course, you should not delete everything in a row, first these objects must be studied. If independent analysis causes difficulties, it is better to entrust it to specialists.
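To shortlist files by modification date, a small script can help; a sketch, assuming Python (the function name and example path are illustrative):

```python
import os
import time

def recently_changed(root, since_epoch):
    """List files under `root` modified after `since_epoch` - a quick
    way to shortlist candidates for manual inspection."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > since_epoch:
                hits.append(path)
    return sorted(hits)

# e.g. everything changed in the last 7 days:
# recently_changed("/var/www/site", time.time() - 7 * 86400)
```

Note that attackers sometimes reset file timestamps, so treat this as a triage aid, not proof of cleanliness.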

After cleaning, be sure to change the passwords that were used to access the site and hosting account.

How to protect your site from viruses

As already mentioned, the bulk of cases of malware getting onto web resources are the result of infection of the computer through which the administrator manages the site. That's why:

  • Keep an eye on the health of your computer: restrict access to it to family members, refuse unverified programs, do not click on unknown links, run a full antivirus scan from time to time, etc.
  • Do not entrust the storage of passwords for the site, databases and hosting account to browsers or FTP/SSH clients; use a protected password manager instead. The passwords themselves should be long and complex, and don't forget to change them periodically.
  • Try to access the site only via SFTP or SSH, the FTP protocol is insecure.
  • Do not delete error and access logs while they may still be useful to you.
  • Update CMS, additional modules and plugins in a timely manner. If these objects are compromised or no longer supported, they are vulnerable to malware and hacker attacks. Replace them with safer alternatives. Also refrain from using software from unverified sources.
  • Install a good antivirus on the site, such as the AI-Bolit virus and hack cleaning script, or connect it to an automatic treatment and protection service like Virusdie.

Learn more about AI-Bolit and Virusdie services

AI-Bolit (Aibolit) is a lightweight, undemanding antivirus script designed to find all types of malware and vulnerabilities on hosting accounts and websites. It supports any operating system, scripts and CMS. For personal non-commercial use, the service's basic functions are free. In case of infection, specialists help with report analysis, disinfection and preventive protection.

Virusdie is a comprehensive antivirus support service (antivirus, firewall, and a file explorer/editor). In addition to automatically finding and removing viruses, it helps lift blocks and other sanctions imposed on the site by hosting providers, antivirus software and search engines. It supports the most popular CMSs. The service is paid; protecting one site costs 249-1499 rubles per year.

Clean Internet to you!