Core functionality for Blockingmachine, providing robust filter list processing and rule management for AdGuard Home and similar applications.
- Blockingmachine Desktop - Desktop application
- Blockingmachine CLI - Command line interface
- Blockingmachine Database - Filter list repository
- Fast filter list processing
- Rule deduplication
- Remote list fetching with retry logic
- Clean rule formatting
- AdGuard Home compatibility
- TypeScript support
- Automatic retries for failed downloads
- Efficient memory usage
- Async processing support
```bash
# Using npm
npm install @blockingmachine/core

# Using yarn
yarn add @blockingmachine/core

# Using pnpm
pnpm add @blockingmachine/core
```

```typescript
import {
  RuleDeduplicator,
  parseFilterList,
  fetchContent,
} from "@blockingmachine/core";
```
```typescript
// Basic usage
const rules = await parseFilterList("||example.com^");
const deduplicator = new RuleDeduplicator();
const uniqueRules = deduplicator.process(rules);

// Advanced usage with remote lists
async function processRemoteLists(urls: string[]) {
  const deduplicator = new RuleDeduplicator();
  let allRules: string[] = [];
  for (const url of urls) {
    const content = await fetchContent(url);
    if (content) {
      const rules = await parseFilterList(content);
      allRules = [...allRules, ...rules];
    }
  }
  return deduplicator.process(allRules);
}
```

Process and deduplicate filtering rules.
```typescript
class RuleDeduplicator {
  constructor(options?: { caseSensitive?: boolean; keepComments?: boolean });
  process(rules: string[]): string[];
  addRule(rule: string): void;
  clear(): void;
}
```

- `caseSensitive` (boolean, default: `true`): Preserve case when comparing rules
- `keepComments` (boolean, default: `false`): Retain comment lines in output
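To make the effect of these options concrete, here is a plain-TypeScript sketch of the described deduplication semantics (`dedupeRules` is a hypothetical stand-in for illustration, not the library's implementation):

```typescript
// Illustrative only: mimics the documented caseSensitive/keepComments behavior.
function dedupeRules(
  rules: string[],
  options: { caseSensitive?: boolean; keepComments?: boolean } = {},
): string[] {
  const { caseSensitive = true, keepComments = false } = options;
  const seen = new Set<string>();
  const out: string[] = [];
  for (const rule of rules) {
    if (rule.startsWith("!")) {
      // Comment lines are kept or dropped wholesale, never deduplicated
      if (keepComments) out.push(rule);
      continue;
    }
    const key = caseSensitive ? rule : rule.toLowerCase();
    if (!seen.has(key)) {
      seen.add(key);
      out.push(rule); // first occurrence wins; original casing is preserved
    }
  }
  return out;
}
```

With `caseSensitive: false`, `||Example.com^` and `||example.com^` collapse to a single rule; with the default `keepComments: false`, lines starting with `!` are dropped from the output.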
Parse raw filter list content into individual rules.
```typescript
interface ParseOptions {
  skipComments?: boolean;
  skipEmpty?: boolean;
  trim?: boolean;
}

// Example usage
const rules = await parseFilterList("||example.com^\n||example.org^", {
  skipComments: true,
  skipEmpty: true,
  trim: true,
});
```

Fetch remote filter lists with built-in retry logic.
```typescript
interface FetchOptions {
  timeout?: number;
  retries?: number;
  retryDelay?: number;
}

// Example with options
const content = await fetchContent("https://example.com/filterlist.txt", {
  timeout: 5000, // 5 seconds
  retries: 3, // Try 3 times
  retryDelay: 1000, // Wait 1 second between retries
});
```

```typescript
async function loadRules() {
  try {
    const content = await fetchContent("https://example.com/filterlist.txt");
    if (!content) {
      console.error("Failed to fetch content");
      return;
    }
    return parseFilterList(content);
  } catch (error) {
    console.error("Error processing rules:", error);
  }
}
```

- Memory Management
```typescript
// Process large lists in chunks
const deduplicator = new RuleDeduplicator();
for (const chunk of chunks) {
  const rules = await parseFilterList(chunk);
  deduplicator.process(rules);
}
```

- Error Recovery
```typescript
// Implement retry logic for failed fetches
const content = await fetchContent(url, {
  retries: 5,
  retryDelay: 2000,
});
```

We welcome contributions! Please see our contributing guidelines for details.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the BSD 3-Clause License - see the LICENSE file for details.
You are free to:
- Use the software commercially
- Modify the software
- Distribute the software
- Offer your own warranty when distributing the software
Under the following conditions:
- License and copyright notice must be included with the software
- Neither the name of the copyright holder nor the names of contributors may be used to endorse or promote derived products
- Source code must retain copyright notice, list of conditions, and disclaimer
- Blockingmachine Desktop - Desktop application
- Blockingmachine CLI - Command line interface
- Documentation
- Issue Tracker
- Discussions
- AdGuard for their excellent filter syntax documentation
- All our contributors and users
```typescript
const sources = [
  "https://example.com/list1.txt",
  "https://example.com/list2.txt",
];

const processAllLists = async () => {
  const deduplicator = new RuleDeduplicator();
  const results = await Promise.allSettled(
    sources.map((url) => fetchContent(url)),
  );
  for (const result of results) {
    if (result.status === "fulfilled" && result.value) {
      const rules = await parseFilterList(result.value);
      deduplicator.process(rules);
    }
  }
};
```

```typescript
const customProcessor = async (content: string) => {
  const rules = await parseFilterList(content, {
    skipComments: true,
    skipEmpty: true,
    trim: true,
  });
  // Custom processing logic
  return rules.filter((rule) => {
    // Filter out rules containing specific patterns
    return !rule.includes("specific-pattern");
  });
};
```

- Process large lists in chunks
- Clear the deduplicator cache periodically
- Use streaming for very large files
```typescript
const processLargeFile = async (filePath: string) => {
  const deduplicator = new RuleDeduplicator();
  const CHUNK_SIZE = 1000;
  let rules: string[] = [];
  // Read the file in chunks (readFileInChunks and processRules are
  // application-supplied helpers, shown here for illustration)
  for await (const chunk of readFileInChunks(filePath)) {
    const parsedRules = await parseFilterList(chunk);
    rules = [...rules, ...deduplicator.process(parsedRules)];
    if (rules.length > CHUNK_SIZE) {
      // Process the accumulated chunk and clear the cache
      await processRules(rules);
      deduplicator.clear();
      rules = [];
    }
  }
};
```

```typescript
const concurrentProcessing = async (urls: string[]) => {
  const BATCH_SIZE = 5; // Process 5 URLs at a time
  const results: string[] = [];
  for (let i = 0; i < urls.length; i += BATCH_SIZE) {
    const batch = urls.slice(i, i + BATCH_SIZE);
    const batchResults = await Promise.all(
      batch.map((url) => fetchContent(url)),
    );
    results.push(...batchResults.filter((r): r is string => Boolean(r)));
  }
  return results;
};
```

Enable debug logging by setting the environment variable:
```bash
# macOS/Linux
export DEBUG=blockingmachine:*
```

```typescript
// In your code
const debug = require("debug")("blockingmachine:core");
debug("Processing rules:", rules.length);
```

```typescript
// Increase timeout for slow connections
const content = await fetchContent(url, {
  timeout: 10000, // 10 seconds
  retries: 5,
});
```

```typescript
// Use streaming API for large files
const deduplicator = new RuleDeduplicator({
  useStreaming: true,
  chunkSize: 1000,
});
```

```typescript
// Add delays between requests
const delay = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

const fetchWithRateLimit = async (urls: string[]) => {
  for (const url of urls) {
    await fetchContent(url);
    await delay(1000); // Wait 1 second between requests
  }
};
```

Q: What types of filter lists are supported?
A: We support AdGuard-style filter lists, including:
- Domain-based rules (`||example.com^`)
- Basic pattern rules (`/ads/`)
- Comment lines (`! This is a comment`)
- AdGuard Home specific syntax
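The rule types above can be told apart with simple prefix checks; the sketch below (a hypothetical `classifyRule`, not the library's parser) shows the shape of each:

```typescript
// Illustrative classifier for the rule types listed above.
type RuleKind = "domain" | "pattern" | "comment" | "other";

function classifyRule(rule: string): RuleKind {
  const r = rule.trim();
  if (r.startsWith("!")) return "comment"; // ! This is a comment
  if (r.startsWith("||") && r.endsWith("^")) return "domain"; // ||example.com^
  if (r.startsWith("/") && r.endsWith("/")) return "pattern"; // /ads/
  return "other"; // e.g. AdGuard Home specific syntax
}
```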
Q: How large of a filter list can this handle?
A: The library is optimized for large lists and can handle millions of rules when used with proper memory-management practices (see the Performance Tips section).

Q: Why is processing taking longer than expected?
A: Several factors can affect processing speed:
- Large number of rules
- Complex pattern matching
- Network latency when fetching remote lists
- System memory constraints
Solution: Use the chunking and streaming options described in the Performance Tips section.
Q: How do I combine multiple filter lists?
```typescript
const combineLists = async (urls: string[]) => {
  const deduplicator = new RuleDeduplicator();
  let allRules: string[] = [];
  for (const url of urls) {
    const content = await fetchContent(url);
    if (content) {
      const rules = await parseFilterList(content);
      allRules = [...allRules, ...rules];
    }
  }
  // Deduplicate across every fetched list in one pass
  return deduplicator.process(allRules);
};
```

Q: How can I exclude certain domains from being blocked?
```typescript
const excludeDomains = (rules: string[], excludeList: string[]) => {
  return rules.filter((rule) => {
    return !excludeList.some((domain) => rule.includes(domain));
  });
};
```

Q: Why am I getting timeout errors?
A: Remote lists might be slow to respond. Try:
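Note that an `includes` check is a substring match, so excluding `example.com` would also drop rules for `notexample.com`. A stricter variant (hypothetical `excludeDomainsExact`, not part of the library) extracts the rule's domain and matches it exactly or as a subdomain:

```typescript
// Stricter exclusion: compare the rule's domain exactly or as a subdomain,
// instead of a raw substring check. Illustrative helpers only.
function ruleDomain(rule: string): string | null {
  const m = rule.match(/^\|\|([^\^\/\$]+)\^/);
  return m ? m[1] : null;
}

function excludeDomainsExact(rules: string[], excludeList: string[]): string[] {
  return rules.filter((rule) => {
    const domain = ruleDomain(rule);
    if (!domain) return true; // keep non-domain rules untouched
    return !excludeList.some(
      (ex) => domain === ex || domain.endsWith(`.${ex}`),
    );
  });
}
```

Excluding `example.com` then removes `||example.com^` and `||sub.example.com^` but keeps `||notexample.com^` and non-domain rules like `/ads/`.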
```typescript
const content = await fetchContent(url, {
  timeout: 30000, // 30 seconds
  retries: 5, // 5 attempts
  retryDelay: 2000, // 2 seconds between retries
});
```

Q: How do I handle invalid rules?
A: Use the parsing options to skip problematic rules:
```typescript
const rules = await parseFilterList(content, {
  skipInvalid: true,
  onError: (error, rule) => {
    console.warn(`Skipping invalid rule: ${rule}`);
  },
});
```

Q: Can I use this with Express.js?
A: Yes, here's a basic example:
```typescript
import express from "express";
import { RuleDeduplicator, parseFilterList } from "@blockingmachine/core";

const app = express();
app.use(express.json());

app.post("/process-rules", async (req, res) => {
  try {
    const rules = await parseFilterList(req.body.content);
    const deduplicator = new RuleDeduplicator();
    const processed = deduplicator.process(rules);
    res.json({ rules: processed });
  } catch (error) {
    res.status(500).json({ error: (error as Error).message });
  }
});
```

Q: How do I save processed rules to a file?
A: Use Node's built-in file system module:
```typescript
import { promises as fs } from "fs";

const saveRules = async (rules: string[], filepath: string) => {
  await fs.writeFile(filepath, rules.join("\n"), "utf8");
};
```

Q: How often should I update my filter lists?
A: Best practices suggest:
- Daily updates for actively maintained lists
- Weekly updates for stable lists
- Implement rate limiting when fetching multiple lists
- Use the `If-Modified-Since` header (supported by `fetchContent`)
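The `If-Modified-Since` mechanism can also be driven directly with the standard `fetch` API; the sketch below (hypothetical helpers, independent of `fetchContent`) shows the underlying HTTP exchange:

```typescript
// Build conditional-request headers from the time of the last successful fetch.
function conditionalHeaders(lastFetched?: Date): Record<string, string> {
  return lastFetched ? { "If-Modified-Since": lastFetched.toUTCString() } : {};
}

// Returns null when the server answers 304 Not Modified, so the caller
// can skip re-parsing an unchanged list. Illustrative only.
async function fetchIfModified(
  url: string,
  lastFetched?: Date,
): Promise<string | null> {
  const res = await fetch(url, { headers: conditionalHeaders(lastFetched) });
  if (res.status === 304) return null; // unchanged since last fetch
  if (!res.ok) throw new Error(`HTTP ${res.status} for ${url}`);
  return res.text();
}
```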
Q: How do I handle updates efficiently?
A: Use the incremental update feature:

```typescript
const updateRules = async (existingRules: string[], newContent: string) => {
  const newRules = await parseFilterList(newContent);
  const deduplicator = new RuleDeduplicator();
  return deduplicator.process([...existingRules, ...newRules]);
};
```

- Initial public release
- Core functionality implementation
- TypeScript support
- Remote list fetching
- Rule deduplication
- Async processing
- Works with Blockingmachine Desktop
- Improved compatibility with Blockingmachine CLI
- Bug fixes and performance improvements
- Rule statistics and analytics
- Enhanced pattern matching
- Support for additional filter list formats
- Better network resilience
- Rule optimization algorithms
- Reduced bundle size
- Extended test coverage
- Streaming API for large files
- Internationalization support
- Enhanced security features
- Performance improvements
- Plugin system
- Third-party integrations
- Improved error handling
- Improved CLI compatibility
- Fixed several critical bugs
- Performance optimizations
- Documentation updates
- Added more comprehensive examples
- Fixed issues with NPM package integration
- Added support for additional filter formats
- Enhanced error handling
- Improved documentation
- Fixed dependency issues
- Fixed CLI integration bugs
- Initial public release
- Core functionality stable
- Basic documentation
- Essential features implemented
- Feature complete
- Internal testing
- Performance optimization
- Documentation drafting
- Core architecture
- Basic feature implementation
- Initial testing setup