
database.py


This file contains the main database class of the project. The methods are divided as follows:

  1. **init** - constructs the database connection with SQLAlchemy and prepares the tables
  2. **queries** - use the database session and return results from queries
  3. **commit** - insert new data into the database; the commits include internal queries

The class uses the ORM concept with the SQLAlchemy API:
ORM lets us define tables and relationships using Python classes, and also provides a system to query and manipulate the database using object-oriented code instead of hand-written SQL.
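
As a rough illustration, here is a minimal sketch of such class-based table definitions, written against SQLAlchemy 1.x. Only the class names (UserT, Location, Tag, Reputation, User_Tags) and the ('name', 'rank', 'website_id') unique set come from this page; the table and column names are assumptions:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, UniqueConstraint
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship

Base = declarative_base()

class Location(Base):
    __tablename__ = 'locations'            # assumed table name
    id = Column(Integer, primary_key=True)
    name = Column(String, unique=True)     # locations are declared unique

class UserT(Base):
    __tablename__ = 'users'
    # the unique set that prevents duplicate users (described below)
    __table_args__ = (UniqueConstraint('name', 'rank', 'website_id'),)
    id = Column(Integer, primary_key=True)
    name = Column(String)
    rank = Column(Integer)
    website_id = Column(Integer)
    location_id = Column(Integer, ForeignKey('locations.id'))
    location = relationship('Location')

class Tag(Base):
    __tablename__ = 'tags'
    id = Column(Integer, primary_key=True)
    name = Column(String, unique=True)     # tags are declared unique

class Reputation(Base):
    __tablename__ = 'reputations'
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey('users.id'))
    value = Column(Integer)

class User_Tags(Base):
    __tablename__ = 'user_tags'
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey('users.id'))
    tag_id = Column(Integer, ForeignKey('tags.id'))
    tag = relationship('Tag')
```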

The most important method is insert_user_to_DB(self, user):

To make as few commits as possible, we created a commit_list that collects all the new instances throughout the function and then adds and commits them at once. First, we get the website id from the data and check whether the user's location already exists in the database; if not, a new Location instance is created.
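
A sketch of that pattern, assuming the scraped user arrives as a plain dict (the session setup, sample data and field names are all assumptions):

```python
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

engine = create_engine('sqlite:///users.db')    # assumed backend
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

user = {'name': 'alice', 'rank': 17, 'location': 'Berlin', 'reputation': 1200,
        'tags': ['python'], 'url': 'https://example.com/users/1'}
website_id = 1              # assumed; taken from the scraped data in the real code

commit_list = []            # collects new instances for one combined commit

location = session.query(Location).filter_by(name=user['location']).first()
if location is None:        # location not in the database yet
    location = Location(name=user['location'])
    commit_list.append(location)
```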

Before creating any other new instances, we must create a UserT instance and commit it to the database. This way a user with a user.id exists, and we can connect it to all of the user's other information in the different instances.
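
Continuing the sketch, this early commit might look like:

```python
# Commit the user row on its own; afterwards new_user.id is populated and the
# other instances can reference it.
new_user = UserT(name=user['name'], rank=user['rank'],
                 website_id=website_id, location=location)
session.add(new_user)
session.commit()
```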

Then we create a Reputation instance, which happens for every user. Last but not least come the tags: we check whether a Tag instance already exists; if not, a new one is created, and if it does, that instance is fetched. A connection is then formed between the Tag and a new User_Tags instance.
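
The rest of the flow might look roughly like this (the constructor arguments are assumptions); the queued commit_list is committed later, inside the error handling sketched at the end of the page:

```python
# A Reputation row is created for every user.
commit_list.append(Reputation(user_id=new_user.id, value=user['reputation']))

for tag_name in user['tags']:
    tag = session.query(Tag).filter_by(name=tag_name).first()
    if tag is None:                    # create the tag only if it does not exist yet
        tag = Tag(name=tag_name)
        commit_list.append(tag)
    # connect the tag to the new user through the association table
    commit_list.append(User_Tags(user_id=new_user.id, tag=tag))
```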

To avoid duplication of users in the database, a unique constraint on the set ('name', 'rank', 'website_id') was declared. Location, tags and website name were also declared unique. Due to the above, every commit is wrapped so that an IntegrityError exception is caught to avoid a crash. In case of a duplication, two actions take place (see the sketch after this list):

  1. If the user was already committed, the user is deleted from the database.
  2. The user's URL is logged in logger_not_scrapped, so we can check what happened and scrape him again later if needed; a warning is also logged in logger_user.
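
A sketch of the final commit together with this error handling (the real file guards every commit this way; the logger names come from this page, while the messages and call sites are assumptions):

```python
import logging
from sqlalchemy.exc import IntegrityError

logger_not_scrapped = logging.getLogger('logger_not_scrapped')
logger_user = logging.getLogger('logger_user')

try:
    session.add_all(commit_list)               # one add + commit for everything queued
    session.commit()
except IntegrityError:
    session.rollback()                         # avoid crashing on a duplicate
    if new_user.id is not None:                # the user row was already committed
        session.delete(new_user)
        session.commit()
    logger_not_scrapped.info(user['url'])      # revisit and re-scrape later if needed
    logger_user.warning("duplicate entry for user %s", user['name'])
```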
