Bot Management Technology Landscape: A 360-degree view (Part 1)


The technology landscape is ever evolving, and within it, internet bots have emerged as powerful tools that automate repetitive tasks on the World Wide Web that would otherwise require human intervention. More than 60% of internet traffic today is generated by bots, which underscores how significant they have become. However, bots are not always used with the intent of adding value to the internet ecosystem; repurposing them for malicious purposes has become common practice.

In Part 1 of this blog series, we take a 360-degree view of the Bot Management landscape, starting with the phases of bot development. We will go over the crucial preparatory steps of application reconnaissance that lay the groundwork for bot development, and then walk through the actual stages of development.

In Part 2 of this series, we will cover why a Bot Management solution like Radware Bot Manager is essential to thwart bot attacks, which are continuously on the rise.

But let us first start by understanding some basics of internet bots.

Internet Bots

Internet Bots are software solutions designed to perform repetitive tasks on target applications autonomously, simulating human interactions and decision-making.

A few examples of internet bots are:

  1. Chatbots, which simulate human conversation by providing pre-programmed answers to phrases entered by a person.
  2. Crawlers / Spiders, which scan pages across the internet and index them for search engines.
  3. Monitoring Bots, which monitor the uptime and health of websites.
  4. E-commerce Recommendation Bots, which provide personalized product recommendations to end users based on their search history on e-commerce sites.
  5. Scraping Bots, which gather relevant content from websites.

Bot developers build on top of target applications hosted by others, and the ethical implications of doing so are always in question. A developer might leverage an existing application to enhance its functionality, or probe it for vulnerabilities that can be exploited for malicious ends. For example:

Lead generation companies may use scraping bots to gather information (such as email IDs and phone numbers) from various online sources to build databases. The details are then filtered by industry and sold to clients, who use them to run targeted marketing campaigns and reach potential customers. This is a legitimate use case for scraping bots.

In contrast, a scraping bot can also be used for price scraping, which is predominant in the e-commerce industry today. A product listed for $10 on e-commerce site ABC can be scraped by a competitor, XYZ, which then offers the same product on its own site for $9. This effectively lets XYZ snatch the deal from ABC for the same product sold on both sites. This is a malicious use of scraping bots and can hurt businesses.
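To make this concrete, here is a minimal sketch of how such a price-scraping bot could look, using the requests and Beautiful Soup libraries. The URL, the CSS selector and the page structure are purely hypothetical assumptions made for illustration; a real bot would iterate over an entire catalogue and store the results.

```python
# Minimal price-scraping sketch (hypothetical site and markup).
import requests
from bs4 import BeautifulSoup

# Hypothetical competitor product page used only for illustration.
URL = "https://www.example-shop.com/product/12345"

def fetch_price(url):
    # Many bots spoof a browser User-Agent to blend in with normal traffic.
    headers = {"User-Agent": "Mozilla/5.0 (compatible; price-check/1.0)"}
    response = requests.get(url, headers=headers, timeout=10)
    response.raise_for_status()

    soup = BeautifulSoup(response.text, "html.parser")
    # Assumed markup: the price sits inside a <span class="price"> element.
    price_tag = soup.select_one("span.price")
    return price_tag.get_text(strip=True) if price_tag else None

if __name__ == "__main__":
    print(fetch_price(URL))
```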

Let us now dive into the behind-the-scenes tasks carried out by bot developers for building a bot.

Application Reconnaissance

Before development, developers perform Application Reconnaissance on the target application.

Application reconnaissance, also referred to as “app recon”, is the process of gathering intelligence from a target software application to understand its structure, architecture, functionalities, operations, potential weaknesses, and security vulnerabilities.

The methods and tools used can vary widely, but the idea is to gain a better understanding of the application by any available means.

App Recon can involve the following processes:

  1. Passive Reconnaissance: Passive reconnaissance is used to extract information from the target application without actively engaging with its systems. Techniques used include:
    • OSINT (Open-Source Intelligence): Gathering publicly available information about the application and its organization from sources like social media, forums, or publicly accessible documents.
    • Ecosystem evaluation: This involves gathering information about the backend infrastructure and configurations used to run the target application. Information such as the type of machines being used, their operating systems, the supporting software running on them, and the target application’s programming language is commonly assessed.
    • Network evaluation: The network the application interacts with is also reconnoitered, typically by eavesdropping. For example, bot developers search for DNS information such as IP addresses and domain ownership. Internet tools like Netcraft, nslookup or whois can be used for this type of research, tools like Wireshark are common for sniffing, and tools like Shodan are used to identify vulnerable devices connected to the application over the internet. (A minimal sketch of this step follows this list.)
  2. Active Reconnaissance: Active reconnaissance is used to extract information from the target application by actively engaging with its systems.
    • Network Mapping: Bot developers try to connect with target systems to map their network topology.
    • Vulnerability Scanning: This commonly used technique involves running automated tools against the target application to identify potential weaknesses that can be exploited. Checks for flaws such as SQL injection and cross-site scripting (XSS) are commonly carried out during vulnerability scanning.
    • Port Scanning: This helps determine the application’s attack surface by scanning for open ports through which data can be sent to or received from the target. Tools like Nmap, Angry IP Scanner and Netcat are commonly used for this.
    • DNS Enumeration: DNS enumeration is used to locate all the DNS servers, zone file records and subdomains linked to an organization’s root domain. These can sometimes also yield computer names, IP addresses and usernames of target systems.
  3. Crawling and Indexing:
    • Crawling: Crawlers can be run against the target application to traverse the entire website page by page, mapping its topology. Crawling is primarily used by search engines for indexing (see next point), but it is also common in app recon to map site structure.
    • Indexing: Indexing is used to catalogue the links, content and metadata identified on each webpage after crawling a website.
  4. Content Discovery: Attempts can be made to discover hidden or sensitive directories and files by probing likely URLs on the application. This can expose files that were not meant to be public.
  5. Error Messages and Responses: Analysing error messages and responses from the target application to identify potential vulnerabilities or misconfigurations.
  6. Feedback and User Interaction: Bot developers may interact with the application to gather information about its responses, user feedback, or functionality.
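As an illustration of the network evaluation and port scanning steps above, the sketch below resolves a domain and checks a handful of common ports using only Python's standard socket module. The target domain and the port list are assumptions chosen for the example; real reconnaissance typically relies on dedicated tools such as nslookup, whois or Nmap.

```python
# Minimal recon sketch: DNS resolution plus a naive TCP port check.
import socket

# Hypothetical target domain used only for illustration.
TARGET = "example.com"
COMMON_PORTS = [21, 22, 80, 443, 8080]

def resolve(domain):
    # Equivalent in spirit to an nslookup query for the domain's A record.
    return socket.gethostbyname(domain)

def is_port_open(ip, port, timeout=2.0):
    # A TCP connect scan: if the handshake succeeds, the port is open.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((ip, port)) == 0

if __name__ == "__main__":
    ip = resolve(TARGET)
    print(f"{TARGET} resolves to {ip}")
    for port in COMMON_PORTS:
        state = "open" if is_port_open(ip, port) else "closed/filtered"
        print(f"  port {port}: {state}")
```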

Bot Development

Bot development involves several stages from conceptualizing a bot’s function to its development and deployment. The specific stages can vary but here are the common ones:

  1. Define the scope of the bot:
    1. Define the purpose and functionalities of the bot.
    2. Identify the goal, target demographics, preferences and need for the bot.
    3. Determine the platforms and channels where the bot will be deployed (e.g., websites, mobile applications, messaging apps, social media).
    4. Define the use cases in which the bot will be applied.
  2. Design, Architecture and Development:
    1. Decide on the technology stack and development tools to be used. Browser automation tools and Python libraries like Selenium, Scrapy and Beautiful Soup are commonly used by bot developers for scripting (see the sketch after this list).
    2. Consider where the bot will reside. For example, botnets would need to run on cloud platforms, whereas internal support chatbots deployed by an organization can run on-premises.
    3. Decide how the data will be collected. Web scrapers, APIs, Robotic Process Automation bots, Natural Language Processing, and Optical Character Recognition bots that can scan documents and images are commonly used for this.
    4. Plan how the data collected by the bot will be processed. Collected data should be cleaned, transformed, and analysed for meaningful insights.
  3. Testing and Quality Assurance:
    1. Testing the bot for bugs is a crucial step. The bot should perform the task it is intended for.
    2. Because bots are meant to run in various environments, they should be tested for compatibility with machines of different specifications.
    3. Verify bot responses and data collection accuracy.
    4. Test performance.
  4. Deployment:
    1. Host the bot in the environment (e.g., cloud server, web server) it is intended for.
    2. Deploy monitoring tools that can be used to track the bot’s performance.
  5. Scaling (if needed):
    1. Plan for scalability as user demand grows. This may involve optimizing the bot’s architecture and resources.
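As a concrete example of the design and development stage, here is a minimal browser-automation bot built with Selenium, one of the libraries mentioned above. The URL, the headless Chrome setup and the CSS selector are assumptions made for illustration only; an actual bot would add error handling, scheduling and data storage.

```python
# Minimal Selenium bot sketch: open a page and collect product names.
from selenium import webdriver
from selenium.webdriver.common.by import By

# Hypothetical target page; assumes ChromeDriver is installed and on PATH.
URL = "https://www.example-shop.com/catalogue"

def collect_product_names(url):
    options = webdriver.ChromeOptions()
    options.add_argument("--headless=new")  # run without opening a browser window
    driver = webdriver.Chrome(options=options)
    try:
        driver.get(url)
        # Assumed markup: each product name sits in an element with class "product-name".
        elements = driver.find_elements(By.CSS_SELECTOR, ".product-name")
        return [el.text for el in elements]
    finally:
        driver.quit()

if __name__ == "__main__":
    for name in collect_product_names(URL):
        print(name)
```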

Next up is Part 2 of this blog series, where we will go over a few additional aspects of how bot developers work. The focus of that blog, however, will be on why a dedicated Bot Management solution like Radware Bot Manager is critical for any organization looking to protect itself against the attempts of these bot developers. Do keep watching this space.

Protecting against all types of attacks is just one of the many reasons so many organizations rely on Radware. If you would like to speak with a talented and tenured Radware security professional, you can reach them HERE. They would love to hear from you.

Amrit Talapatra

Amrit Talapatra is a product manager at Radware, supporting its bot manager product line. He plays an integral role in helping define the product vision and strategy for the industry-leading Radware Bot Manager. With over 10 years of experience in the security and telecom domains, he has helped clients in over 30 countries take advantage of offerings from the ground up. He holds bachelor’s and master’s degrees in computer applications.
