
Fix: Implement file locking to prevent data races. #19


Description

@Thedrogon

Currently, concurrent executions of `task` can cause data loss, because the delete and edit commands perform a non-atomic read-modify-write cycle on the database file. I propose a lockfile mechanism to serialize access to it.

If you run `task done 1` and `task add "Buy Milk"` at the exact same time, in two terminal tabs or from a script:

1. Process A (`done`) reads the file.

2. Process B (`add`) appends "Buy Milk" to the file.

3. Process A (`done`) finishes processing and overwrites the file with its stale copy, which doesn't contain "Buy Milk".

Result: "Buy Milk" is lost forever.

Fix: introduce a file lock. This ensures that only one instance of `task` can touch the file at a time; any second instance waits (or fails fast) until the first one finishes.
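
A minimal sketch of how the lock could work, assuming a Go implementation and flock(2)-style advisory locking via golang.org/x/sys/unix (the sidecar ".lock" file and the `withFileLock` helper are my own naming, not existing project code):

```go
package main

import (
	"os"

	"golang.org/x/sys/unix"
)

// withFileLock takes an exclusive advisory lock on a sidecar
// lockfile, runs fn, and releases the lock when fn returns. A
// second task process calling this blocks in Flock until then.
func withFileLock(dbPath string, fn func() error) error {
	lock, err := os.OpenFile(dbPath+".lock", os.O_CREATE|os.O_RDWR, 0o644)
	if err != nil {
		return err
	}
	defer lock.Close()

	if err := unix.Flock(int(lock.Fd()), unix.LOCK_EX); err != nil {
		return err
	}
	// The kernel drops the lock automatically if the process dies,
	// so a crash can't leave the database permanently locked.
	defer unix.Flock(int(lock.Fd()), unix.LOCK_UN)

	return fn()
}
```

Every command would then wrap its read-modify-write in this, e.g. `withFileLock("tasks.json", func() error { /* load, mutate, save */ return nil })`. For the "fail instead of wait" behavior, passing `unix.LOCK_EX|unix.LOCK_NB` makes `Flock` return an error immediately, so the second instance can print a message and exit. Note that flock(2) is Unix-only; if Windows support matters, a cross-platform wrapper such as github.com/gofrs/flock would be needed instead.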

@anikchand461 @abhirajadhikary06 I already have a working solution. Should I take this issue?
