A Bash-based automation tool that parses system and application logs to detect errors, warnings, and critical events.
It generates structured, timestamped reports and supports scheduling via cron, reducing manual log-analysis effort in IT operations.
- Large log files are difficult to scan manually.
- Critical issues may be missed, causing downtime.
- Businesses need automated, repeatable monitoring.
✔ Automates error detection
✔ Provides clear, timestamped reports
✔ Easy to integrate into existing workflows
- Bash Scripting
- Linux Utilities: `grep`, `awk`, `sed`, `cut`, `sort`, `uniq`, `gzip`
- Cron Jobs → automation & scheduling
- Git/GitHub → version control & portfolio showcase
The project is developed using Agile methodology with 4 sprints (12 days total).
Each sprint delivers incremental functionality, ensuring continuous progress and usable features.
| Sprint | Duration | Goals | Deliverables |
|---|---|---|---|
| Sprint 1: Core Setup | Days 1–3 | • Repository setup & docs • Script skeleton • Basic error filtering (ERROR, WARNING, CRITICAL) • Error counts | ✅ Repo initialized ✅ `log_analyzer.sh` skeleton ✅ Basic filtering working |
| Sprint 2: Categorization | Days 4–6 | • Extract timestamps • Group by process/service • Date/time filters • CLI flags/options | ✅ Categorized logs ✅ CLI usability improved |
| Sprint 3: Reporting | Days 7–9 | • Generate structured reports (.txt / .csv) • Timestamped filenames • Top recurring errors • Color-coded console output • Report compression | ✅ Reports with insights ✅ Compressed archives |
| Sprint 4: Automation & Polish | Days 10–12 | • Automate daily execution with cron • Log rotation • Error handling • Optional email reports • Final documentation & screenshots | ✅ Automated daily runs ✅ Polished docs ✅ Portfolio-ready project |
| Day | Task | Sprint |
|---|---|---|
| 1–3 | Core script & error counts | Sprint 1 |
| 4–6 | Timestamps, filters, CLI options | Sprint 2 |
| 7–9 | Reports, summaries, compression | Sprint 3 |
| 10–12 | Automation, polish, final docs | Sprint 4 |
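Sprint 4's daily automation boils down to one crontab entry. A possible sketch, assuming the script is installed at `/opt/log_analyzer/log_analyzer.sh` (path and schedule are illustrative):

```shell
# Sketch: run the analyzer daily at 06:00 and capture its output.
# Install path and log destination below are assumptions.
entry='0 6 * * * /opt/log_analyzer/log_analyzer.sh >> /var/log/log_analyzer_cron.log 2>&1'

# Append the entry to the current user's crontab, avoiding duplicates.
( crontab -l 2>/dev/null | grep -vF 'log_analyzer.sh'; echo "$entry" ) | crontab -
```

Redirecting both stdout and stderr (`2>&1`) keeps failed runs visible, since cron itself discards output unless mail is configured.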
- `log_analyzer.sh` – Core script
- Reports (`.txt`/`.csv`)
- Automated daily runs via `cron`
- Documentation (README + examples)
- Integration with Splunk/ELK/Grafana for dashboards
- Alerting via Slack/Email APIs
- Support for JSON/structured logs