Bonifield J. Data Engineering for Cybersecurity. Build Secure Data Pipelines 2025
Category: Other
Total size: 55.94 MB
Added: 2025-08-23 07:27:02
Seeders: 59 | Leechers: 0
Info Hash: 6291674DDAA36A3C4961AB143D7837F85A49F698
Last updated: 2025-09-18 13:59:34
Description:
Textbook in PDF format
Turn raw logs into real intelligence.
Security teams rely on telemetry: the continuous stream of logs, events, metrics, and signals that reveal what's happening across systems, endpoints, and cloud services. But that data doesn't organize itself. It has to be collected, normalized, enriched, and secured before it becomes useful. That's where data engineering comes in.
In this hands-on guide, cybersecurity engineer James Bonifield teaches you how to design and build scalable, secure data pipelines using free, open source tools such as Filebeat, Logstash, Redis, Kafka, and Elasticsearch. You'll learn how to collect telemetry from Windows (including Sysmon and PowerShell events), from Linux files and syslog, and from streaming network and security appliances. You'll then transform it into structured formats, secure it in transit, and automate your deployments using Ansible.
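To give a feel for the transformation step, here is a minimal Python sketch that parses one raw syslog line into a structured event; the regular expression, sample line, and field names are illustrative assumptions, not the book's own schema or code.

    # Minimal sketch: turn one raw syslog line into a structured event.
    # The sample line, regex, and field names are illustrative assumptions.
    import json
    import re

    RAW = "Aug 23 07:27:02 web01 sshd[1042]: Failed password for root from 203.0.113.7 port 52144 ssh2"

    PATTERN = re.compile(
        r"^(?P<timestamp>\w{3}\s+\d+\s[\d:]+)\s"
        r"(?P<host>\S+)\s"
        r"(?P<process>[\w\-/]+)\[(?P<pid>\d+)\]:\s"
        r"(?P<message>.*)$"
    )

    def parse_syslog(line: str) -> dict:
        """Parse one syslog line into a dictionary, or flag it for review."""
        match = PATTERN.match(line)
        if not match:
            return {"outcome": "parse_failure", "raw": line}
        return match.groupdict()

    print(json.dumps(parse_syslog(RAW), indent=2))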
You'll also learn how to:
Encrypt and secure data in transit using TLS and SSH
Centrally manage code and configuration files using Git
Transform messy logs into structured events
Enrich data with threat intelligence using Redis and Memcached (see the sketch after this list)
Stream and centralize data at scale with Kafka
Automate with Ansible for repeatable deployments
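As an example of the enrichment item above, here is a minimal Python sketch of a threat-intelligence lookup against Redis using the redis-py client; the intel:ip:<address> key scheme and the sample indicator are assumptions for illustration, not the book's code.

    # Minimal sketch of threat-intel enrichment with Redis via the redis-py client.
    # The key scheme intel:ip:<address> and the seeded indicator are hypothetical.
    import redis

    r = redis.Redis(host="localhost", port=6379, decode_responses=True)

    # Seed one example indicator; in practice a feed loader would populate these keys.
    r.set("intel:ip:203.0.113.7", "known-scanner")

    def enrich(event: dict) -> dict:
        """Attach a threat label to the event if its source IP is a known indicator."""
        label = r.get(f"intel:ip:{event.get('source_ip', '')}")
        if label:
            event["threat"] = {"indicator": event["source_ip"], "label": label}
        return event

    print(enrich({"source_ip": "203.0.113.7", "message": "Failed password for root"}))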
Whether you're building a pipeline on a tight budget or deploying an enterprise-scale system, this book shows you how to centralize your security data, support real-time detection, and lay the groundwork for incident response and long-term forensics.
Prerequisites:
This book assumes a basic familiarity with the Linux operating system and its command line. We'll walk through all configuration steps and commands to run, but you'll find it easiest to follow along if you're used to working in the terminal. Here are some other topics you might want to familiarize yourself with before moving on:
- Virtual machines
Virtual machines are virtualized computers running inside other computers. For example, you might run a Linux virtual machine on your Windows desktop or laptop. The software that runs the virtual machine is called a hypervisor, and the computer it runs on is the host. Virtual machines are helpful for practicing new tools and concepts, as you can take a snapshot of their current state and then return the machine to that state if anything goes wrong.
- TCP and UDP
The tools covered throughout this book make network connections using the Transmission Control Protocol (TCP) and User Datagram Protocol (UDP). TCP is a protocol that requires systems to acknowledge receipt of the data they exchange, making it resilient to connectivity issues. TCP is useful for tasks like file transfers and email delivery, where it's important not to lose any data. UDP doesn't require an acknowledgment from a recipient, and it's useful for tasks like watching videos or listening to audio without constantly being interrupted. If some of the bytes drop while traveling from the server to your screen, the show goes on without a hiccup.
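For a concrete illustration of the difference, here is a minimal Python sketch that sends the same message once over TCP and once over UDP using the standard socket module; the loopback address and port numbers are placeholders, and a listener you control is assumed to be running on them.

    # Minimal sketch contrasting TCP and UDP sends with Python's socket module.
    # The address and ports are placeholders; a listener is assumed to be running.
    import socket

    message = b"hello from the pipeline"

    # TCP: connect() completes a handshake, and lost segments are retransmitted.
    with socket.create_connection(("127.0.0.1", 5140), timeout=5) as tcp_sock:
        tcp_sock.sendall(message)

    # UDP: no handshake and no delivery guarantee; the datagram is simply sent.
    udp_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    udp_sock.sendto(message, ("127.0.0.1", 5141))
    udp_sock.close()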
- HTTP and TLS
You'll use the Hypertext Transfer Protocol (HTTP) to request and receive data from the web. HTTP is an unencrypted protocol, meaning an overly curious or malicious actor on the same network could position themselves to inspect the content of your communications with a web server. Today, most websites use HTTPS, which encrypts traffic with Transport Layer Security (TLS), the "S" in "HTTPS." We'll cover TLS further in Chapter 2 and use it to encrypt all network traffic when able, including for data fetched from the web and communications between tools.
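As a small illustration of TLS in practice, here is a minimal Python sketch that wraps a TCP connection with the standard ssl module and prints the negotiated protocol version; the hostname is just an example public endpoint, not one used by the book.

    # Minimal sketch: wrap a TCP connection in TLS and inspect the session.
    # The hostname is an example endpoint; certificate verification uses system CAs.
    import socket
    import ssl

    hostname = "www.python.org"
    context = ssl.create_default_context()  # verifies the server certificate

    with socket.create_connection((hostname, 443), timeout=10) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=hostname) as tls_sock:
            print("Negotiated protocol:", tls_sock.version())  # e.g. TLSv1.3
            print("Server subject:", tls_sock.getpeercert()["subject"])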
- SSH
Secure Shell (SSH) is a technology that creates an encrypted tunnel, or network connection, from one computer to another. It's often used to remotely control a computer as if you're using its local keyboard. You'll configure several virtual machines in this book and then make extensive use of SSH to interact with them. You'll also explore an automation tool, Ansible, that uses SSH to orchestrate actions on many hosts at once. Chapter 2 discusses SSH in more detail.
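As a small illustration, here is a minimal Python sketch that runs a single command on a remote machine by invoking the standard ssh client through subprocess; the user@host placeholder and key-based authentication are assumptions, and this is not the book's Ansible workflow.

    # Minimal sketch: run one command on a remote host over SSH from Python.
    # "user@host" is a placeholder and key-based authentication is assumed.
    import subprocess

    result = subprocess.run(
        ["ssh", "user@host", "uptime"],
        capture_output=True,
        text=True,
        timeout=30,
    )
    print(result.stdout.strip() if result.returncode == 0 else result.stderr.strip())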
- Scripting languages
In this book, you'll write a few scripts using the Python programming language and use the Ruby programming language to transform data. We'll walk through these scripts' contents in detail. That said, you could also automate many of the commands we'll cover using shell scripts that execute Linux commands. Some experience creating shell, Python, or Ruby scripts is recommended, but not necessary, to follow along.
Who This Book Is For:
This book is for anyone tasked with creating a cybersecurity monitoring system or centralizing a business's logs. It may be particularly useful for those operating with constrained budgets, but it is just as relevant to those operating with enterprise-level funding. Network defenders can use this book to transform events taken from a multitude of systems and tools into a standard format before storing them. Administrators and engineers can use it to manage the many device health logs flowing through the network. Offensive testers can use it to read and transform the varied outputs of hacking tools, store them for client reports, automate the comparison of results, and dispatch additional tool actions. Those seeking to automate defensive or offensive actions may also find centralizing and standardizing logs useful.