Deployed Splunk on Portainer and set up all my Docker containers to stream logs to it.

Seems to be free as long as Splunk doesn’t ingest over 500MB a day.
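
For anyone curious about the plumbing: the Docker "splunk" logging driver posts container logs to Splunk's HTTP Event Collector (HEC), which listens on port 8088 by default. A quick way to sanity-check that endpoint is a tiny Python script like the one below (the URL and token are placeholders for my instance, so adjust them for yours):

    # Send a test event to Splunk's HTTP Event Collector (HEC).
    # The URL and token are placeholders; point them at your own instance.
    import requests

    SPLUNK_HEC_URL = "https://splunk.example.lan:8088/services/collector/event"
    HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder HEC token

    event = {
        "event": {"message": "hello from the homelab", "container": "test"},
        "sourcetype": "_json",
        "index": "main",
    }

    resp = requests.post(
        SPLUNK_HEC_URL,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        json=event,
        verify=False,  # typical homelab self-signed cert
        timeout=5,
    )
    print(resp.status_code, resp.text)  # expect 200 and a "Success" body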

Opinions?

  • the4thaggie@alien.topB

    Splunk architect for about 7 years here. 500MB of logs a day is a lot of ingest for a home lab. Your biggest issue will probably be the lack of a login prompt (the free license disables authentication) if you expose it to the internet. I also think you lose the ability to run a deployment server role to centrally push log-collection configs to universal forwarders.

    We had to move to Elastic because the higher-ups saw a slight cost savings. I'm paying the price for it in engineering time. Splunk's SPL (search language; there's a small taste of it sketched at the end of this comment), the sheer number of premade add-ons (which do things like parse logs into extracted fields), and the premade apps (Splunk knowledge objects like dashboards, reports, and alerts) far exceed what the Elastic stack offers.

    Though if you're looking for a turnkey solution and don't want to learn how to search, most of Splunk's power will be lost on you. Same for Elastic, I suppose. I find Splunk's approach more intuitive. Elastic is like Google and AWS (if you're familiar with their design decisions): powerful but completely asinine and unintuitive until you get past the learning curve.
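
    To give a feel for SPL, here's a rough sketch of kicking off a search from Python against a local instance's REST API (management port 8089). The index, sourcetype, and credentials are made up, so swap in your own:

        # Run an SPL search via Splunk's REST API and stream the results.
        # Index, sourcetype, and credentials below are placeholders.
        import requests

        SEARCH = (
            "search index=docker sourcetype=httpevent earliest=-24h "
            "| stats count by source "
            "| sort - count"
        )

        resp = requests.post(
            "https://splunk.example.lan:8089/services/search/jobs/export",
            auth=("admin", "changeme"),   # placeholder credentials
            data={"search": SEARCH, "output_mode": "json"},
            verify=False,                 # homelab self-signed cert
            timeout=60,
            stream=True,
        )

        # The export endpoint streams one JSON object per result line.
        for line in resp.iter_lines():
            if line:
                print(line.decode())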

  • HTTP_404_NotFound@alien.topB

    As a Splunk architect, I really enjoy it.

    For home use, it's OK, but without the enterprise features you lose a lot of the capabilities.

    You CAN use cribl.io with it to replace a lot of the missing features… and to reduce the amount of data being stored. It has an extremely generous 1TB/day free plan.

    You can also use the universal forwarders, as they do not have a license attached.

    Data only counts against the license when it is written by an indexer.

    There are also ways of using the enterprise plan… by selectively not storing certain files under /etc… and restarting the container every few days.

  • Dizzybro@alien.topB

    Splunk is crazy, crazy powerful and fast. If you are lucky enough to work for a firm that can afford their licensing (so you can make some really advanced queries), it’s an awesome tool to learn in depth.

  • bufandatl@alien.topB

    Use ELK. It's basically the same thing, but open source and unlimited for free. Also, Splunk sucks. I have to use it at work and it really isn't great. (My personal opinion.)

  • cylemmulo@alien.topB

    Splunk is definitely a lot at first. It's not the most intuitive to set up, but it's cool how much flexibility you get.

  • dagamore12@alien.topB

    If you are homelabbing it to grow work skills or to add it as a known item on your resume, it can be a good thing. For a decent-sized lab, 500MB is plenty.

    If you are looking for a homelab monitoring tool for securing your systems, I would look at Wazuh. It does everything Splunk does and more of it out of the box, it's quicker to set up, and it has more built-in tools than anything short of a very highly customized Splunk setup.

  • shifty21@alien.topB

    Splunk employee and homelabber here! (I’m one of a few moderators for r/splunk)

    I have the free version running at home collecting logs from pihole, opnsense, proxmox, HomeAssistant, and all the Windows and Linux machines, both physical and virtual.

    I see all the shenanigans on my network! Like when my teenage daughter is up at 2AM on her phone on a school night. And when my wife was looking at law firms in our area.

    The one thing I wish Splunk could do is figure out why my kids' therapy bills are so high.

  • canassa@alien.topB

    I’ve used both Splunk and Datadog in my current job, but I wasn’t particularly impressed with either. In both cases, the costs escalated quickly. Now, we’re limited to a 15-day retention period, which, in my opinion, significantly diminishes the system’s usefulness.

    In another company, where I had greater decision-making authority, I took a different approach. I directed all journald logs to a central repository using systemd-journal-remote and provided SSH access to developers who needed to view the logs. This setup was straightforward and efficiently handled a vast volume of logs at no cost. Journald’s binary and structured format allowed for advanced searches. Additionally, I configured our primary Python application to log directly to journald, utilizing its structured logging features.
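
    For anyone wanting to try the same thing, here's a minimal sketch of structured logging straight to journald from Python via the python-systemd bindings (systemd-python on PyPI). The field names are just examples, not what we actually used:

        # Structured logging to journald using the python-systemd bindings.
        # Custom field names must be uppercase; the ones below are examples.
        from systemd import journal

        journal.send(
            "order processed",              # becomes the MESSAGE field
            PRIORITY=6,                     # 6 = informational
            SYSLOG_IDENTIFIER="billing-api",
            ORDER_ID="12345",
            CUSTOMER="acme",
        )

        # The extra fields are then directly searchable, e.g.:
        #   journalctl SYSLOG_IDENTIFIER=billing-api ORDER_ID=12345 -o json-pretty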