Log Management Planning Calculator

I was using the cloud service from EditGrid, but they went offline – use the three calculators I built below instead.


Select the “click to edit” button at the top of the spreadsheet to start entering data. Select the drop-down button in the top-left corner for features such as full screen, download as Excel, and info related to EditGrid.

To use it, just enter the total quantity of each device type into the “Device Quantity” column. The “Per Device EPS” column provides industry averages for the events-per-second (EPS) rate of each device type; you can replace these values with your own. Next, modify the values next to the text highlighted in red under the “Event Capacity Planning” section to finish your planning.

You may want to do this separately for every remote site you plan on aggregating events from, to model the bandwidth and storage for each site.
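The spreadsheet's math reduces to a sum of products. Here is a minimal Python sketch of that arithmetic; the device types and per-device EPS figures below are illustrative placeholders, not the calculator's actual industry averages, and the 500-byte average event size is an assumption:

```python
# Hypothetical sketch of the calculator's core math: total EPS from
# device counts, then rough event volume and sustained bandwidth.

AVG_EVENT_BYTES = 500  # assumed average raw event size (placeholder)

devices = {            # device type -> (quantity, per-device EPS)
    "firewall":          (2, 50.0),
    "windows_server":    (20, 2.0),
    "domain_controller": (4, 25.0),
}

total_eps = sum(qty * eps for qty, eps in devices.values())
daily_events = total_eps * 86_400                        # seconds in a day
bandwidth_kbps = total_eps * AVG_EVENT_BYTES * 8 / 1000  # sustained link load

print(f"Total EPS:      {total_eps:,.0f}")
print(f"Events per day: {daily_events:,.0f}")
print(f"Bandwidth:      {bandwidth_kbps:,.1f} kbit/s")
```

Running this per remote site, as suggested above, gives a per-site bandwidth figure you can check against the WAN link to each site.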
Go here for calculator



9 Responses to Log Management Planning Calculator

  • Hi there,

    I’m drafting an RFP for an MSSP SOC. Part of the RFP groundwork is to provide the following.

    (a) Selection criteria for the SIEM technology best suited to an MSSP SOC supporting 25,000 devices and over 300 clients

    (b) EPS/MPS calculation for a mixed set of 25,000 devices, and sizing the storage for 2–3 years

    (c) Which SIEM has modular scalability, plus compute and RAM requirements per 10,000 EPS (vendor-agnostic studies)

    (d) Costing and budgetary calculations (approximate investment)

    (e) Return on Investment (ROI), high level

    (f) Staffing required to operate 24x7x365

    Can anybody help with some credible studies and analysis for me to read so I can do the qualitative work?

    Regards, and looking forward to your kind and swift support.

  • Hi there…I’m trying to size a SIEM solution and I’m looking to find examples of what an average EPS count would be for a Windows Domain Controller/DNS server and one for a DHCP server.

  • Hi Netcerebral,

    I’m drafting an RFP for a standard SOC environment. Part of the RFP groundwork is to provide the following.

    (a) The client has suggested using HP ArcSight for 100K EPS

    (b) Sizing the Storage for 3 years

    Can anybody help with some credible studies? I am looking for how much storage space I need to quote, and which vendor (SAN/NAS/Hitachi/NetApp, and so on).

    Regards, and looking forward to your kind and swift support.

    • Hi there,
      If the client has already decided which vendor they will use for their Centralized Log Management (CLM), the planning is easier: you can simply work with the vendor’s solution architect to build out that portion of the infrastructure (they will know all of the metrics and nuances of integrating with their solution).
      100K EPS is huge when you consider 100,000 x 86,400 (seconds in a day), plus whatever overhead the vendor’s normalization and indexing techniques apply to each message.
      That’s 8,640,000,000, or 8.6 billion, events per day. If, for the sake of argument, we pad each message to a standard 1024 bytes after normalization, you are talking about 8,847,360,000,000 bytes, or roughly 8.8 terabytes a day!
      Fortunately, what makes CLM products so attractive is their ability to compress the data store. However, while marketing specs may claim 10:1 or 20:1 compression of the event store, real-world numbers vary based on the amount of white space in the message (device type, logging levels, etc.), the number of message fields being indexed, and other considerations specific to how the event sources are logging.
      Given this fact, it’s safe to work with a lower compression ratio for long-term storage (i.e., if the vendor states a 10:1 compression for at-rest and archived data, use a conservative number such as 5:1).
      Therefore, if you are generating 8.8 TB of raw data daily, even at 5:1 you’ll need a CLM with roughly 1.8 TB of writable space A DAY! This is where you need to check with the vendor’s solution architect to identify the size of appliance or software license needed to store this amount of data.
      Now the 3-year mandatory archive period – this is generally dictated by a governance or compliance initiative. What many customers fail to do is classify their data and optimize the archiving by getting rid of the “white noise” that is usually not required to be kept for the dictated retention periods. A rule of thumb here is that data can generally be classified by business purpose. If 33% of the data is needed for compliance reporting (with 3-year retention), 33% is needed for IT operations (60-day retention), and the remaining 34% is considered useless “white noise”, why would a customer keep 100% of the logs for 3 years and pay for 67% slack space beyond its required retention?
      As it stands now, roughly 1.8 TB of daily compressed logs would require about 650 terabytes of storage for 365 days of archive – nearly 2 petabytes for three years!
      This translates to hundreds of thousands of dollars in managed storage / vaulting costs, which usually represents a significant portion of any IT budget. If customers could optimize their data by classifying it, de-duplicating, aggregating, and filtering out the white noise, they could work with a digestible amount of data, both for retention and for allocating the necessary budget.
      As far as which storage solution is right for this client, that will depend on their appetite for the care-and-feeding of a NAS/SAN vs. a managed storage solution. Also, corporate security policies may dictate whether they use a SAN or NAS, especially if they don’t allow CIFS / NFS mounts across their servers.
      My apologies if I am only scratching the surface and can’t answer all of your questions with specifics regarding any one vendor, but I try to keep my blog vendor-agnostic and provide “generic” considerations for sizing solutions.

  • I can’t find the calculator anymore – where can I download it from?

  • Clicking the “Go here for calculator” hyperlink yields this:

    This webpage is not available

    Can you provide the new link? Thanks… Joey

  • Greatly appreciate the updated hyperlinks. Very helpful, thanks!
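The storage-sizing walkthrough in the reply above boils down to a few multiplications. Here is a minimal Python sketch reproducing that arithmetic, assuming the same illustrative figures (100K EPS, 1 KiB per normalized event, a conservative 5:1 compression ratio, and the 33/33/34 classification split); none of these constants come from any specific vendor:

```python
# Sketch of the reply's storage math: raw daily volume, compressed
# daily volume, a full 3-year archive, and the smaller footprint after
# classifying the data by retention need.

EPS = 100_000
EVENT_BYTES = 1024   # padded size after normalization (assumed)
COMPRESSION = 5      # conservative ratio vs. a vendor's 10:1 claim
TB = 10**12          # decimal terabyte

daily_raw = EPS * 86_400 * EVENT_BYTES   # bytes per day, uncompressed
daily_stored = daily_raw / COMPRESSION   # bytes per day on disk

three_year = daily_stored * 365 * 3      # keep everything for 3 years

# Classification: 33% compliance data kept 3 years, 33% operations
# data kept 60 days, 34% white noise discarded outright.
optimized = daily_stored * (0.33 * 365 * 3 + 0.33 * 60)

print(f"Raw per day:         {daily_raw / TB:,.1f} TB")
print(f"Compressed per day:  {daily_stored / TB:,.2f} TB")
print(f"3-year archive:      {three_year / TB:,.0f} TB")
print(f"With classification: {optimized / TB:,.0f} TB")
```

The gap between the last two figures is the point of the reply: classifying data before archiving cuts the long-term footprint to roughly a third of keeping everything.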
