diff --git a/README.md b/README.md index 479d738..67a2aa4 100644 --- a/README.md +++ b/README.md @@ -1,1013 +1,228 @@ # Dirac.js - Postgres ORM Thing -Paul Dirac was a theoretical physicist who made fundamental contributions to the early development of both quantum mechanics and quantum electrodynamics. Dirac.js is a flexible and extendable database layer for Node Postgres. +Paul Dirac was a theoretical physicist who made fundamental contributions to the early development of both quantum mechanics and quantum electrodynamics. Dirac.js is a flexible and extendable database querying library for Node Postgres. -## Database as a Data Structure - -Dirac.js is built on top of [MoSQL](https://github.com/goodybag/mongo-sql), whose primary goal is to provide SQL query construction, but maintain value consistently throughout. This library extends that goal allowing you to reflect on the overall state of your database and retrieve your table structure in semantic JSON. - -Dirac provides you with a decent foundation to start a postgres project with. It allows you to easily group all of your table logic and schema into one file and keep things generally dry and well-namespaced. - -## Features - -* Non-destructive database syncing -* Destructive database syncing -* Standard crud -* Robust JSON queries -* Before/After Middleware -* Easy-to-use database function organization - -## Examples - -High-level _single-query_ relationships helpers: +__Quick Example__ ```javascript -var options = { - // order.user -> {} - // Make the user object available as a sub-json field - one: [ { table: 'users', alias: 'user' } - // order.restaurant -> {} - , { table: 'restaurants', alias: 'restaurant' - // order.restaurant.hours -> [] - // Within the restaurant object, fetch their hours in a json array - , many: [ { table: 'restaurant_hours', alias: 'hours' } ] - } - ] - // order.items -> [] -, many: [ { table: 'order_items', alias: 'items' - // order.items[].price|options - // Technically, just applies a left-join - , mixin: [ { table: 'restaurant_items' - , columns: ['price', 'options'] - } - ] - // order.items[].tags -> [string] - // Pull item tags into an array of strings - , pluck: [ { table: 'restaurant_item_tags' - , column: 'name' - , alias: 'tags' - } - ] - } - ] -}; - -dirac.dals.orders.findone( 123, options, function( error, order ){ - /* ... 
*/
-});
+// Find a random user
+db.users.findOne()
+  // id > 100
+  .where({ id: { $gt: 100 } })
+  // or id < 50
+  .where('id.$or.$lt', 50)
+  .limit(10)
+  // user {} will have an array of orders []
+  // This works efficiently without multiple database trips
+  // and can be optimized by the user
+  .many('orders')
+  .execute()
+  .then( user => console.log( user.orders[0] ) )
+  .catch( error => console.error( error ) )
 ```
 
-Register a new table with dirac:
-
-```javascript
-var dirac = require('dirac');
-
-dirac.register({
-  name: 'users'
-, schema: {
-    id: {
-      type: 'serial'
-    , primaryKey: true
-    }
-
-  , groups: {
-      type: 'int[]'
-    , references: {
-        table: 'groups'
-      , column: 'id'
-      }
-    }
-
-  , email: {
-      type: 'text'
-    , unique: true
-    }
-
-  , createdAt: {
-      type: 'timestamp'
-    , default: 'now()'
-    }
-  }
-
-  // Add your own custom functions
-, findOneWithGroups: function( user_id, callback ){
-    return this.find( user_id, {
-      joins: { /* add appropriate joins */ }
-    }, callback );
-  }
-});
-```
-
-Connect to your database, sync, and query:
-
-```javascript
-dirac.init('postgres://server/my_database');
-// or
-dirac.init({
-  host: 'localhost' // <- this actually isn't required because host defaults to localhost
-, database: 'my_database'
-});
-
-// Tell dirac to use all of the schemas in `/tables`
-dirac.use( dirac.dir( __dirname + '/tables' ) );
+__Install__
 
-// Creates new tables, performs non-destructive schema changes
-db.sync(); // Optionally pass { force: true } to do a complete wipe
-
-// You do not need to supply a callback.
-// You can start querying right away since node-pg
-// queues queries until ready
-dirac.dals.users.find({ id: { $gt: 5 } }, function(error, users){
-  /* ... */
-});
-
-// If the first parameter to findOne isn't an object, we assume we're querying by id
-// Dirac wraps the value in an object like this: { id: 57 }
-dirac.dals.users.findOne( 57, function(error, user){ /* ... */ });
-
-// Update user 57, set name = 'poop' returning "users".*
-dirac.dals.users.update( 57, { name: "poop" }, { returning: ['*'] }, function(error, users){
-  /* ... */
-});
-
-// delete from users where name = "poop" returning *
-dirac.dals.users.remove( { name: "poop" }, { returning: ['*'] }, function(error, users){
-  /* ... */
-});
+```bash
+npm install -S dirac
 ```
 
-## API
-
-Dirac has two namespaces:
-
-* Root
-* Database
-
-The Root namespace is for top-level non-table specific methods while the Databasse namepace is for table specfic methods
-
-### Root
-
-#### dirac.init( connStr [options], [options] )
-
-Connect to Postgres
-
-__Arguments:__
-
-* Connection String or Options
-* Options
-  - Must contain property called ```connStr``` or ```host```, ```port```, and ```database```
-  - Will mix into ```pg.defaults```
-
-___Options:___
-
-* ```connectionString```
+__Index__
 
-#### dirac.dropAllTables( [callback] )
+* [Examples](#examples)
+* [Documentation](#examples)
+* [Why Dirac?](#why-dirac)
 
-Drops all tables registered in dirac.
+## Why Dirac?
 
-__Arguments:__
+When dirac was first written, the only seemingly viable ORM for node and postgres was Sequelize. For whatever reason, the style and source code did not jibe with me. I wanted a simple, no-brainer querying interface, so I made dirac. I used dirac in production for a couple of small sites and one large site for years, and for the most part, the experience was fantastic.
 
-* Callback ```(error)```
+Now, we've got Knex and Bookshelf, which seem to be really great libraries. 
Still, I'm left with the feeling that the abstractions are needlessly complex and the API doesn't seem as _nice_ as it could be. Admittedly, this is probably because I've been working with my own tools for the past 3 years. I just don't think I could live without [MoSQL](https://github.com/goodybag/mongo-sql) as my query builder. Its relentless obsession with query introspection and conformance to Postgres's SQL engine makes any query possible. -#### dirac.register( name, schema ) +### The Relationships Middleware -Registers a new table or view with dirac. Will not actually create the table until ```dirac.sync()``` is called. Alternatively, you could call: ```dirac.dals.table_name.createIfNotExists()``` to manually add it. However, ```sync``` will resolve table dependencies and it will also save the database state so dirac can reason about your current table structure. - -__Arguments:__ - -* Name - name of the table -* Schema - as described in [https://github.com/goodybag/mongo-sql](https://github.com/goodybag/mongo-sql) create table statement definitions - -__Example:__ +About a year into Gooydbag.com's useage with dirac.js, I found myself needing to fetch sub-resources on top-level queries. They needed to be formatted as JSON documents like so: ```javascript -// Register table -dirac.register({ - name: 'users' -, schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - } -}); +// Results: -// Register View -dirac.register({ - name: 'bobs' -, type: 'view' -, query: { - type: 'select' - , table: 'users' - , where: { name: { $ilike: 'bob' } } +[ { id: 11 + , name: 'John' + // Users->Orders + , orders: [ { id: 123 + , user_id: 11 + // Users->Orders->OrderItems + , items: [ { ...} ] + // Users->Orders->Restaurant + , restaurant: { ... } + } + ] } -}); -``` - -#### dirac.sync( options ) - -Perform non-destructive syncs: - -* Add new tables -* Add new columns -* Add column constraints - -Options: - -* force - If true, will perform a destructive sync, thus clearing any orphan columns - -#### dirac.use( middleware ) - -Pass a function to dirac to be called whenever ```dirac.init``` is called. Useful for initializing before/after filters and adding database-wide properties to all schemas. - -__Arguments:__ - -* middleware( dirac ) - the function that will be called when dirac is initialized. It's passed a single parameter (the dirac module). - -__Example:__ - -```javascript -/** - * db/middleware/created-at.js - */ - -var utils = require('utils'); - -// Middleware automatically adds created_at/updated_at fields to schemas -module.exports = function( options ){ - return function( dirac ){ - utils.defaults( options, { - createdAt: { - name: 'created_at' - , type: 'timestamp' - , default: 'now()' - } - , updatedAt: { - name: 'updated_at' - , type: 'timestamp' - , default: 'now()' - } - }); - - // Adds fields to a DAL - var addFields = function( dal ){ - var schema = dirac.dals[ dal ].schema; - - // Add createdAt if it's not already there - if ( !(options.createdAt.name in schema) ){ - schema[ options.createdAt.name ] = options.createdAt; - } - - // Add updatedAt if it's not already there - if ( !(options.updatedAt.name in schema) ){ - schema[ options.updatedAt.name ] = options.updatedAt; - } - }; - - // Before filter adds updatedAt = 'now()' to the update query - var updateFilter = function( $query, schema, next ){ - // Updates may be on values or updates - var values = 'values' in $query ? 
$query.values : $query.updates; - - values[ options.updatedAt.name ] = 'now()'; - - next(); - }; - - // Registers before filters to update updatedAt - var addFilters = function( dal ){ - dirac.dals[ dal ].before( 'update', updateFilter ); - }; - - // Add fields to each DAL - Object.keys( dirac.dals ).forEach( addFields ) - - // Add filters to each DAL - Object.keys( dirac.dals ).forEach( addFilters ) - }; -}; + ... +] ``` -```javascript -/** - * db/index.js - */ - -var middleware = { - createdAt: require('./middleware/created-at') -}; - -dirac.use( - middleware.createdAt({ - updatedAt: { - name: 'last_updated' - , type: 'timestamp' - , default: 'now()' - } - }) -); - -// DAL registration -// ... -// ... - -// After init is called, all functions specified in use are called -dirac.init( config.db ); -``` - -#### dirac.createTable( ) - -Explicitly create a DALs table. You don't really need to use this unless you're adding new DALs, even then, _you should just call ```sync```_ - -#### dirac.saveCurrentDbState( ) - -Save an entry in the dirac_schemas table of the current DAL state in memory. This happens everytime you call ```sync``` - -#### dirac.setMoSql( instance ) - -Sets dirac's instance of [MoSQL](https://github.com/goodybag/mongo-sql). Useful if you're already using MoSQL in your project. - -### Database - -All table interfaces are accessed through the ```dirac.dals``` namespace. Each table is defined as an instance of Dirac.Dal. - -#### dirac.dals.table_name.find( $query, [options], callback ) - -Select documents in ```table_name```. ```$query``` object is the ```where``` property of a MoSQL object. ```options``` is everything else. - -__Arguments:__ - -* $query - MoSQL conditional query ( select where clause ) -* options - Anything else that would go in a MoSQL query ( limit, offset, groupBy, etc ) -* callback - ```function( error, results ){ }``` - -__Example:__ - -```javascript -// Query where condition -var $query = { - rating: { $gte: 3.5 } -, high_score: { $lt: 5000 } -, name: { $in: [ 'Bob', 'Alice', 'Joe', 'Momma' ] } -}; - -// Other options for the query -var options = { - columns: [ - '*' // users.* - , { // Get average user high_score - type: 'average' // Name of the function - , as: 'average_score' // Name of the column - , expression: 'users.high_score' // Function argument - } - ] -, offset: 50 -, limit: 25 -, order: { column: 'id', direction: 'desc' } -, group: [ 'id', 'name' ] -}; - -dirac.dals.users.find( $query, options, function( error, results ){ - /* ... */ -}); -``` - -#### dirac.dals.table_name.findOne( $query, [options], callback) - -Identical to find only it adds a ```limit: 1``` to the options and will return an object rather than an array. Substitute an ID for $query. - -__Arguments:__ - -* $query - MoSQL conditional query ( select where clause ) or ID -* options - Anything else that would go in a MoSQL query ( limit, offset, groupBy, etc ) -* callback - ```function( error, result ){ }``` - -#### dirac.dals.table_name.remove( $query, [options], callback ) - -Removes a document from the database. Substitute an ID for $query. - -__Arguments:__ - -* $query - MoSQL conditional query ( select where clause ) or ID -* options - Anything else that would go in a MoSQL query ( returning, etc ) -* callback - ```function( error, result ){ }``` - -#### dirac.dals.table_name.update( $query, $update, [options] callback ) - -Update documents in the database. Substitute an ID for $query. 
- -__Arguments:__ - -* $query - MoSQL conditional query ( select where clause ) or ID -* $update - Object whose keys map to column names and values map to values -* options - Anything else that would go in a MoSQL query ( returning, etc ) -* callback - ```function( error, result ){ }``` - -#### dirac.dals.table_name.insert( document, [options], callback ) - -Insert a doument - -__Arguments:__ - -* document - Object whose keys map to column names and values map to values -* options - Anything else that would go in a MoSQL query ( returning, etc ) -* callback - ```function( error, result ){ }``` - -#### dirac.dals.table_name.before( [fnName], handler... ) - -Add a _before_ filter to the DAL. Before filters are like middleware layers that get run before the query is executed. You can add as long as a chain as you'd like. ```...``` denotes you can add as many handlers as you want. - -__Arguments:__ - -* fnName [optional] - If provided, will add the filter only to the method on the dal, otherwise will add on all methods. -* handler - The logic for your before filter. Will be called withe following arguments: - + $query - The full MoSQL query object along with the values - + schema - The schema for the current table - + next - A function to tell dirac to go the next function in the before stack - (If you pass an argument to ```next```, dirac assumes that it is an - error and will send the value back to the consumers callbaack) - -__Example:__ +I knew how to make the queries that would produce this sort of result (using JSON functions in postgres). The question was whether we could glean the application's database structure and _use that structure_ to make writing queries easier. So I set to work on a dirac plugin that did just that: ```javascript -dirac.register({ - name: 'books' -, schema: { - id: { type: 'serial', primaryKey: true } - , name: { - type: 'text' - - // Dirac doesn't know anything about this object - // So we can use it for our own benefit - , validation: { - type: 'string' - , max_length: 250 - } - } - } -}) - -// Crappy validation -dirac.dals.books.before( 'insert', function( $query, schema, next ){ - if ( typeof $query.values.name != schema.name.validation.type ) - return next({ type: 'VALIDATION_ERROR', message: 'invalid type for `name`' }); - - if ( $query.values.name.length > schema.validation.max_length ) - return next({ type: 'VALIDATION_ERROR', message: 'invalid length for `name`' }); - - /* ... */ +var where = {}; +db.users.find( where, { + many: [ { table: 'orders' + , one: [ { table: 'restaurants', alias: 'restaurant'} ] + , many: [ { table: 'order_items', alias: 'items' } ] + } + ] +}, ( error, users )=>{ + // users[]->orders[]->items[] + // ->restaurant{} }); ``` -#### dirac.dals.table_name.after( [fnName], handler... ) - -Add an _after_ filter to the DAL. After filters are like middleware layers that get run after the query is executed. You can add as long as a chain as you'd like. ```...``` denotes you can add as many handlers as you want. - -__Arguments:__ - -* fnName [optional] - If provided, will add the filter only to the method on the dal, otherwise will add on all methods. -* handler - The logic for your after filter. 
Will be called withe following arguments: - + results - The results from the query - + $query - The full MoSQL query object along with the values - + schema - The schema for the current table - + next - A function to tell dirac to go the next function in the after stack - (If you pass an argument to ```next```, dirac assumes that it is an - error and will send the value back to the consumers callbaack) +With those query options, each user returned in the results will have an array of orders, which also had an array of items and a restaurant object. The API is easy, and adding behaviors is simple because we're just working with JSON objects. -__Example:__ +The current syntax isn't much different: ```javascript -dirac.register({ - name: 'books' -, schema: { - id: { type: 'serial', primaryKey: true } - , num_words: { - type: 'text' - - // node-pg returns bigints as strings - // Tell casting after filter to cast to a number - , cast: 'number' - } - } -}) - -// Crappy casting -dirac.dals.books.after( 'find', function( results, $query, schema, next ){ - var casts = {}; - for ( var key in schema ){ - if ( 'cast' in schema ) casts[ key ] = schema[ key ][ cast ]; - } - - // Transform result set - for ( var i = 0, l = results.length; i < l; ++i ){ - for ( var key in casts ){ - switch ( casts[ key ] ){ - case 'int': results[ i ][ key ] = parseInt( results[ i ][ key ] ); break; - case 'number': results[ i ][ key ] = parseFloat( results[ i ][ key ] ); break; - case 'string': results[ i ][ key ] = "" + results[ i ][ key ]; break; - default: break; - } - } - } -}); -``` - -### Transactions - -Transactions can be made by created a transaction object via `dirac.tx.create()`. Normally, every query by default uses a pool client and releases it per request. You do not want to release a client back into the pool in the middle of a transaction, because that would be _very, very bad_. - -For transactions, dirac allows you to access the same client to execute multiple queries until you commit or rollback. - -__Example:__ - -```js -var tx = dirac.tx.create(); - -tx.begin(function(err) { - if ( err ) return tx.rollback(); - tx.users.update(userId, balance: { $inc: 5 } }, function(err) { - if ( err ) return tx.rollback(); - tx.users.insert(userId, balance: { $dec: 5 }, function(err) { - if ( err ) return tx.rollback(); - tx.commit(); - }); - }); -}); -``` - -This can be rather unwieldy so you could use a control library or abstract this further: - -```js -var async = require('async') -var tx = dirac.tx.create(); - -async.series([ - tx.begin.bind(tx) -, tx.users.update.bind(tx.users, userId, { balance: { $inc: 5 } }) -, tx.users.update.bind(tx.users, userId, { balance: { $dec: 5 } }) -], function(err) { - if ( err ) return tx.rollback(); // rollback if any queries fail - tx.commit(); -}); -``` - -If you need to apply [explicit table locks](http://www.postgresql.org/docs/9.4/static/explicit-locking.html) -to a transaction, you can use `.lock(mode)` per table: - -```js -async.series([ - tx.begin.bind(tx) -, tx.users.lock.bind(tx, 'ACCESS EXCLUSIVE') -, tx.users.update.bind(tx.users, userId, { name: 'Billy' }) -, tx.commit.bind(tx)]); -``` - -To query the following - -```sql -BEGIN; -LOCK TABLE users IN ACCESS EXCLUSIVE MODE; -UPDATE "users" set "users"."name" = 'Billy'; -COMMIT; +db.users.find() + .many({ + table: 'orders' + , one: [ { table: 'restaurants', alias: 'restaurant'} ] + , many: [ { table: 'order_items', alias: 'items' } ] + }) + .execute() + .then( users => ... ) + .catch( error => ... 
) ``` -#### dirac.tx.create() - -Creates a new `tx` object which accesses the same pg.Client for transactional queries. - -#### tx.begin( callback ) - -Invokes a `begin` statement - -#### tx.commit( callback ) - -Invokes the `commit` statement and releases the `tx` client. Subsequent queries will throw an error. +Let's go over what the actual differences between `1.0.0` and older versions: -#### tx.rollback( callback ) +#### No more singletons -If you run into an error you can `rollback` and release the client. Subsequent queries will throw an error. - -#### tx data access - -All dirac.dals are available under the `tx` object. - -__Example__ +Previously, dirac exported a singleton that the user interacted with; Accepting that dirac and postgres were just pieces of their environment that they couldn't truly control. Now, dirac exports factories to its various object definitions. The default export is a function that creates `Database` instances. You can alternatively dig into `lib/` and make your own. ```javascript -var tx = dirac.tx.create(); +var db = require('dirac')('postgres://localhost:5432/test_db'); -tx.users.insert({ name: 'Ben Jammin' }, callback); -tx.restaurants.update(5, { name: 'Uncle Billys' }, callback); +// Alternatively: +var Database = require('dirac/lib/database'); +var db = new Database({ connectionString: 'postgres://localhost:5432/test_db' }); ``` -#### Composing Transactions +#### Immutability by default (it's configurable, though) -Often times, you'll need to create custom functions that operate within its own transaction or part of an outside transaction. This is trivial to support and is outlined in the following example: - -Suppose we want to create a function that atomically deletes existing user groups and saves new ones. +Immutable objects are easier to reason about. Knowing that the instance you're working with won't be mutated by the outside world, and more importantly, that the changes you make won't affect other consumers is extremely important. That's why dirac `1.0.0` started with Immutable primitives. ```javascript -var tx = require('dirac').tx.create(); -tx.begin(); +var db = require('dirac')('postgres://localhost:5432/test_db'); -// Create a user -tx.users.insert({ name: 'Bob' }, function( error, user ){ - if ( error ){ - return tx.rollback(); +// Apply a results transform to the database instance +// db2 !== db; +var db2 = db.after( (results, query) => { + if ( query.table() === 'users' ){ + return results.map( user => new User( user ) ); } - // insert default groups - tx.users.updateGroups( user.id, ['consumer'], function( error ){ - if ( error ){ - return tx.rollback(); - } - - tx.commit(); - }); + return result; }); -``` - -How the DAL method would look: -```javascript -{ name: 'users' -, schema: {...} - -, updateGroups: function( user_id, groups, callback ){ - // If this function is called within the context of an - // existing transaction, the client/transaction object - // will be available under `this.client` - var tx = this.client || dirac.tx.create(); - - async.series([ - // If we're a part of an existing transaction, don't worry - // about writing `begin` - this.client ? async.noop : tx.begin.bind( tx ) - - // Remove existing groups - , tx.user_groups.remove.bind( tx.user_groups, { - user_id: user_id - }) - - // Insert if needed - , groups.length > 0 - ? 
tx.user_groups.insert.bind( - tx.user_groups - , groups.map( function( group ){ - return { user_id: user_id, group: group }; - }) - ) - : async.noop - - // If we're a part of an existing transaction, don't worry - // about writing `begin` - , this.client ? async.noop : tx.commit.bind( tx ) - ], function( error ){ - if ( error ){ - // If there's an existing transaction, let's not - // automatically rollback - if ( this.client ){ - return callback( error ); - } - - return tx.rollback( callback.bind( null, error ) ); - } - - return callback(); - }.bind( this )); - } -} +// Make all queries originating from trollDB use the `trolls` table +var trollDB = db2.before( query => query.table('trolls') ); ``` -### Configure Data Access - -Dirac exposes mongo-sql and pg instances through `dirac.db`. -This way your database layer can reuse the same connection pool -and data access configurations. - -#### dirac.db.mosql - -The mongo-sql instance - -#### dirac.db.setMosql( mosql ) - -Replaces the mosql object - -__Arguments__ -* mosql - mongo-sql object +Each change to the database object creates a _new_ database instance, inheriting all properties from the original db instance. This underscores the unifying concept in dirac 1.0.0; Query Generators. -#### dirac.db.pg +#### Query Generators, or everything through the query object -The node-pg instance +The [Query Object](Link to query docs) is the primary interface to do _all_ database work. You could remove every other concept in the library and the Query object would still work quite nicely. Things like the Database and the Table objects are just a means to creating Query objects. They're a class of factories that make queries that inherit useful members from its parent. -#### dirac.db.setPg( pg ) - -Replaces the node-pg object - -__Arguments__ - -* pg - node-pg object - -#### Example of setting dirac.db.pg +```javascript +var db = require('dirac')('postgres://localhost:5432/test_db'); -```js -// Customizing pg so we parse timestamps into moment objects -var pg = require('pg'); -var dirac = require('dirac'); +// query inherits db's connection string. +var query = db.query().table('users').where('id', 1); -var timestampOid = 1114; -var parseTimestamp = function(str) { - return moment(str); -} +var trollDB = db2.before( query => query.table('trolls') ); -pg.types.setTypeParser(timestampOid, parseTimestamp); +// Troll query inherits trollDB's query transforms +// when trollQuery is executed, it will run a transformed version of itself +var trollQuery = trollDb.query().table('users').where('name', { $lt: 'bob' }); -// Now abstractions such as dirac can reuse the same pg. 
-dirac.db.setPg( pg ); -``` +var users = db.table('users'); +var trolls = trollDb.table('users' /* anything could go here, + because it'll be transformed */ ); -## Examples +// Querying from users comes with some defaults +var usersQuery = users.query(); +assert.equal( usersQuery.table(), 'users' ); -### Getting Started +// So does the troll users table +var trollQuery = trolls.query(); +// This is still 'users', until we execute and get the transformed version +assert.equal( trollQuery.table(), 'users' ); +assert.equal( trollQuery.getTransformedQuery().table(), 'trolls' ); -Directory layout: +// Table QueryGenerator also comes with other query factories: +var user123Query = users.findOne(123).many('orders'); +// Remove user 123, returning the user object and their orders in a JSON array +var removeUserQuery = users.remove(123).returning({ many: [ table: 'orders' ] }) +// Or just use the Query by itself: +var Query = require('dirac/lib/query'); +// Find user 123, returning the first result +var query = Query + .create( {}, { connectionString: 'postgres://localhost:5432/test_db' }) + .where('id', 123) + .after( results => results[0] ); ``` -- my-app/ - - db/ - - tables/ - - table_1.js - - table_2.js - - table_3.js - - index.js -``` - -__index.js:__ -```javascript -/** - * db.js -**/ -var dirac = require('dirac'); -var config = require('../config'); - -// Tell dirac to use all of the schemas in `/tables` -dirac.use( dirac.dir( __dirname + '/tables' ) ); - -dirac.init( config.db ); - -// Get our database schemas up to date -// This will add any tables and columns -dirac.sync(); +#### Promises by default -// Expose dals on base db layer so I can do something like: -// db.users.findOne( 7, function( error, user){ /* ... */ }); -for ( var k in dirac.dals ) module.exports[ k ] = dirac.dals[ k ]; -``` - -__table_1.js:__ +Promises are nice, except when some part of your environment doesn't use promises. I've found it easier to work with promises within your callback-oriented code than it is to work with callbacks in your promise-oriented code. I'm not sure why this is (I'm well aware of the ability to automatically convert between the two interfaces with Bluebird's API). ```javascript -/** - * table_1.js - * Export a JavaScript object containing the - * table.name, table.schema -**/ -module.exports = { - name: 'table_1' -, schema: { - id: { type: 'serial', primaryKey: true } - , name: { type: 'text' } - , content: { type: 'text' } - } -, last_updated: { - type: 'date' - , withoutTimezone: true - , default: 'now()' - } -}; +db.users.findOne(123) + .many('orders') + .one({ table: 'regions', alias: 'region' }) + .execute() + .then( user => console.log( 'user.id', user.id ) ) + .catch( e => console.error( e ) ); ``` -### Querying +#### Relationships on be default -One of the nicest parts about dirac is its robust querying DSL. Since it's built on top of MoSQL, we get to take advantage of a fairly complete SQL API. +In previous versions, you had to explicitly use the relationships middleware (the ability to use many/one/pluck/mixin in queries). Now, the default export on dirac is a database factory that automatically includes the relationships middleware. -__Find a single user by id:__ +#### `.register(...)` is totally optional, and it builds a dependency graph for you -```javascript -dirac.dals.users.findOne( 7, function( error, user ){ /* ... */ }); -``` +Previously, all interactions through dirac had to be performed after setting up DALs through the `dirac.register(...)` method. 
Now, using `.register(...)` is optional, unless you want to use the relationships middleware. -__Find a user, join on groups and aggregate into array:__ +As you register tables on your database instance, the dependency graph is updated and available as `db.graph`. ```javascript -var options = { - columns: [ - // Defaults to "users".* - '*' - - // Columns can have sub-queries and expressions like this array_agg function - , { type: 'array_agg', expression: 'groups.name', as: 'groups' } - ] - - // Specify all joins here -, joins: { - groups: { - type: 'left' - , on: { 'user_id': '$users.id$' } +var db = require('dirac')() + .register({ + name: 'users' + , schema: { + id: { type: 'serial', primaryKey: true } } - } -} - -// select "users".*, array_agg( groups.name ) as "groups" from "users" -// left join "groups" on "groups"."user_id" = "users"."id" -// -// Now the user object will have an array of group names -dirac.dals.users.findOne( 7, function( error, user ){ /* ... */ }); -``` - -__Sub-Queries:__ - -You can put sub-queries in lots of places with dirac/MoSQL + }); -```javascript -var options = { - // Construct a view called "consumers" - with: { - consumers: { - type: 'select' - , table: 'users' - , where: { type: 'consumer' } - } - } -} - -var $query = { - name: { $ilike: 'Alice' } -, id: { - $in: { - type: 'select' - , table: 'consumers' - , columns: ['id'] - } +var orders = db.table({ + name: 'orders' +, schema: { + id: { type: 'serial', primaryKey: true } + , user_id: { type: 'int', references: { table: 'users', column: 'id' } } } -}; - -dirac.dals.users.find( $query, options, function( error, consumers ){ /* ... */ }); -``` - -## Built-in Middleware - -Dirac has the following built-in middleware modules: - -* Cast To JSON -* Dir -* Embeds -* [Relationships](#middleware-relationships) -* Table Ref - -### Middleware Relationships - -The relationships middleware allows you to easily embed foreign data in your result set. Dirac uses the schemas defined with `dirac.register` to build a dependency graph of your schemas with pivots on foreign key relationships. It uses this graph to build the proper sub-queries to embed one-to-one or one-to-many type structures. - -__Relationships Directives__ - -* [One](#relationships-one) -* [Many](#relationships-many) -* [Pluck](#relationships-pluck) -* [Mixin](#relationships-mixin) - -__Full Blown Example:__ - -```javascript -var dirac = require('dirac'); - -// Make sure to call relationships before registering tables -dirac.use( dirac.relationships() ); -dirac.use( dirac.dir( __dirname + '/tables' ) ); - -dirac.init( config ); - -// Embed Order Items as an array on the order -// Embed user and restaurant as json objects -var options = { - many: [ { table: 'order_items' - , alias: 'items' - // Automatically do the left join to get item options - , mixin: [{ table: 'order_item_options' }] - } - ] -, one: [ { table: 'restaurants' - , alias: 'restaurant' - } - , { table: 'users' - , alias: 'user' - // Automatically pull out just an array of group names - // that apply to the user object - , pluck: [ { table: 'groups', column: 'name' } ] - } - ] -}; - -dirac.dals.orders.findOne( user.id, options, function( error, order ){ - // Array.isArray( order.items ) => true - // typeof order.restaurant === 'object' => true - // typeof order.user === 'object' => true - // Array.isArray( order.user.groups ) => true }); -``` - -This is all done with a single query to the database without any result coercion. 
-The relationships middleware is applicable to writes, using the `returning` statement: +db = db.register( orders ); -```javascript -// Pull in the restaurant and user objects after updating -var options = { - one: [ { table: 'restaurants' - , alias: 'restaurant' - } - , { table: 'users' - , alias: 'user' - // Automatically pull out just an array of group names - // that apply to the user object - , pluck: [ { table: 'groups', column: 'name' } ] - } - ] -}; +var ordersQuery = db.orders + .find() + .one({ table: 'users', alias: 'user' }); -db.orders.update( 123, { status: 'pending'}, options, function( error, order ){ - // order.restaurant => {...} - // order.user => {...} -}); -``` - -#### Relationships: One - -Applies a one-to-one relationship on the query: - -```javascript -// orders (id, user_id, ...) -// users (id, ...) -dirac.dals.orders.find( {}, { - one: [{ table: 'users', alias: 'user' }] -}) - -// [{ id: 1, user: { ... } }, { id: 2, user: { ... } }] -``` - -#### Relationships: Many - -Applies a one-to-many relationship on the query: - -```javascript -// users (id, ...) -// orders (id, user_id, ...) -dirac.dals.users.findOne( 32, { - many: [{ table: 'orders', alias: 'orders' }] -}) - -// { id: 32, orders: [{ id: 1, user_id: 32, ... }, { id: 2, user_id: 32, ... }] } -``` - -#### Relationships: Pluck - -Like [Many](#relationships-many), but maps on a single field: - -```javascript -// users (id, ...) -// groups (id, user_id, name, ...) -dirac.dals.users.findOne( 32, { - pluck: [{ table: 'groups', column: 'name' }] -}) - -// { id: 32, groups: ['client', 'cool-guys'] } -``` - -#### Relationships: Mixin - -Like [One](#relationships-one), but mixes in the properties from the target into the source (basically a more abstract join). - -```javascript -// Useful for junction tables -// user_invoices (id, user_id, ...) -// user_invoice_orders (id, user_invoice_id, order_id, ...) -// orders (id, ...) -db.user_invoices.findOne( 1, { - many: [ { table: 'user_invoice_orders' - , alias: 'orders' - , mixin: [ { table: 'orders' - , columns: [ 'restaurant_id' - , { name: 'id', table: 'orders', alias: 'order_id' } - ] - } - ] - } - ] -}) - -// { id: 1, orders: [{ id: 1, user_invoice_id: 1, user_id: 1, restaurant_id, order_id }, ... 
] }
 ```
\ No newline at end of file
diff --git a/index.js b/index.js
index 30df396..759b940 100644
--- a/index.js
+++ b/index.js
@@ -1,13 +1,18 @@
-var dirac = module.exports = require('./lib/dirac');
-var utils = require('./lib/utils');
-var diracSchema = require('./lib/dirac-table');
+const Database = require('./lib/database');
+const relationships = require('./middleware/relationships');
+const Query = require('./lib/query');
+const Table = require('./lib/table');
 
-// Database Access
-dirac.db = require('./lib/db');
+module.exports = options => {
+  if ( typeof options === 'string' ){
+    options = { connectionString: options };
+  }
 
-// Middleware
-dirac.tableRef = require('./middleware/table-ref');
-dirac.castToJSON = require('./middleware/cast-to-json');
-dirac.embeds = require('./middleware/embeds');
-dirac.dir = require('./middleware/dir');
-dirac.relationships = require('./middleware/relationships');
+  return Database
+    .create( options )
+    .use( relationships() );
+};
+
+module.exports.relationships = relationships;
+module.exports.query = ( q, options )=> Query.create( q, options );
+module.exports.table = options => Table.create( options );
\ No newline at end of file
diff --git a/inspiration.js b/inspiration.js
new file mode 100644
index 0000000..4b814b5
--- /dev/null
+++ b/inspiration.js
@@ -0,0 +1,191 @@
+var db = require('dirac')({ connectionString })
+
+db.users = db.table('users')
+  .stream({ concurrency: 10 })
+  .where({ id: { $gt: 100 } })
+  .map()
+
+//
+
+var db = require('dirac')({ connectionString: '...' })
+
+db.users = db.dal({
+  name: 'users'
+, schema: {...}
+})
+
+db.users
+  .find()
+  .where({ id: { $gt: 100 } })
+  .where('id.$or.$lt', 500)
+  .exec()
+  .then( users => ... )
+  .catch( error => ... )
+
+// ...
+( db )=>{
+  db.before
+}
+
+db.users.findOne(100)
+  .options({
+    many: [ { table: 'orders'
+            , many: [ ... ]
+            }
+          ]
+  })
+
+db.users.findOne(100)
+  .many({ table: '' })
+
+//
+
+db.tx().begin()
+  // Update book 123
+  .then( tx => {
+    return db.books.update(123)
+      .values({ likes: { $inc: 1 } })
+      .returning('*')
+      .exec( tx )
+  })
+  // Update the book author's total likes
+  .then( tx => {
+    // Transaction results are accumulated on `tx.results`
+    // We know the result is a single object since the operation
+    // was updating the table via the singular condition by primary key
+    return db.authors.update( tx.results.book.author_id )
+      .values({ total_likes: { $inc: 1 } })
+      .returning('*')
+      .exec( tx )
+  })
+  .then( tx => tx.commit() )
+  .catch( error => tx.abort() )
+  .then( results => {
+    res.json( results.book );
+  });
+
+//
+
+var query = users
+  .findOne(123)
+  // Find orders in parallel
+  .and( orders.find().where({ user_id: 123 }) )
+  // Results accumulate on an object
+  // Map the result to just the user object
+  .map( result => {
+    result.user.orders = result.orders;
+    return result.user;
+  })
+  // After that, insert something requiring the user data
+  .then( user => user_thing.insert({ foo: 'bar', user }) )
+
+db.execute( query )
+  .then( user => res.json( user ) )
+  .catch( error => res.error( error ) )
+
+//
+
+var db = require('dirac')()
+  // Use .register to automatically add the table to
+  // the database instance (each register call creates
+  // a new instance unless options.immutable: false). 
+ // As new tables are registered, the dependency graph + // is updated + .register( + db.table({ + name: 'users' + , schema: { + id: { type: 'serial', primaryKey: true } + , name: { type: 'text' } + } + }) + // If you want to map all results from this query factory to + // an application model, you could do something like this: + .after( users => { + if ( Array.isArray( users ) ){ + return users.map( user => UserModel.create( user ) ) + } + + // Otherwise, they used a method that returned one result + return UserModel.create( users ) + }) + ) + // Optionally, just pass in the table definition to register + .register({ + name: 'books' + , schema: { + id: { type: 'serial', primaryKey: true } + , name: { type: 'text' } + } + }) + .register({ + name: 'user_books' + , schema: { + user_id: { type: 'int', references: { table: 'users', column: 'id' } } + , book_id: { type: 'int', references: { table: 'books', column: 'id' } } + } + }) + +// Remove user 123 returning the user object +// and the user's books +var query = db.users + .remove(123) + .returning('*') + .many( + db.query() + .table('user_books') + .alias('books') + .mixin('books') + ); + +// Note: user instanceof UserModel === true +query.exec().then( user => console.log( user.books[0].name ) ) + +// Notes on transactions +// +db.transaction() + .begin( tx => { + + tx.query( ) + .then( ()=> { + + + + }) + .then( ()=> tx.commit() ) + }) + +// Transactions should behave more like Queries +// where calling .query(..) simply adds queries to the transaction +// building up a transaction plan +// and .execute() would execute that plan +var createUser = user => { + return db.transaction() + // Insert base user object + .query( tx => db.users.insert(user) ) + // Insert user's groups with the new user_id assigned + .query( tx => { + let groups = user.groups.map( group => { + return Object.assign( { user_id: tx.user.id }, group ) + }) + + return tx.query( db.user_groups.insert(groups) ) + }) +} + +var createOrder = order => { + return db.transaction() + // Insert base order + .query( tx => db.orders.insert(order) ) + // Also, update the order_statuses table + // This is a bit of a contrived example since you'd probably do this with triggers + .query( tx => db.order_statuses.insert({ order_id: tx.order.id, })) +} + +// This way, you can compose transactions +createUser(user) + .query( tx => createOrder( Object.assign(order, { user_id: tx.user.id } ) ) ) + .execute() + .then( results => { + console.log( results.user, results.order ); + }) \ No newline at end of file diff --git a/lib/alter-table-sync-strategies.js b/lib/alter-table-sync-strategies.js deleted file mode 100644 index e269acf..0000000 --- a/lib/alter-table-sync-strategies.js +++ /dev/null @@ -1,107 +0,0 @@ -var strategies = module.exports = {}; -strategies.registered = {}; - -/** - * Is a strategy registered - * @param {string} strat The strategy in question - * @return {Boolean} Whether it's registered - */ -strategies.has = function( strat ){ - return strat in strategies.registered; -}; - -/** - * Returns the strategy interface - * @param {string} srat The strategy in question - * @return {Object} The strategy interface - */ -strategies.get = function( strat ){ - return strategies.registered[ strat ]; -}; - -/** - * Register a new strategy - * @param {string} name Name of the strategy - * @param {object} options Optional options to be associated - * @param {Function} fn strategy definition: - * callback( name, newCol, oldCol, newSchema, oldSchema ) - */ -strategies.register = function( 
name, options, fn ){ - if ( typeof options === 'function' ){ - fn = options; - options = {}; - } - - strategies.registered[ name ] = { - options: options - , fn: fn - }; -}; - -/** - * Built-in Strategies - */ - -strategies.register( 'primaryKey', function( name, newCol, oldCol, newSchema, oldSchema, table ){ - // Column didn't previously have pkey - // So drop an old one, add the new one - if ( !('primaryKey' in oldCol) ){ - return [ - { // Drop old pkey - dropConstraint: { - name: table + '_pkey' - , ifExists: true - , cascade: true - } - } - , { // Add new pkey - addConstraint: { - name: table + '_pkey' - , primaryKey: name - } - } - ]; - } -}); - -strategies.register( 'default', function( name, newCol, oldCol, newSchema, oldSchema, table ){ - // Was in old, not in new - drop default - if ( 'default' in oldCol && !('default' in newCol) ){ - return { - alterColumn: { - name: name - , dropDefault: true - } - } - } - - if ( 'default' in newCol && !('default' in oldCol) ){ - return { - alterColumn: { - name: name - , default: newCol.default - } - }; - } -}); - -strategies.register( 'unique', function( name, newCol, oldCol, newSchema, oldSchema, table ){ - if ( 'unique' in oldCol && !('unique' in newCol) ){ - return { - dropConstraint: { - name: [ table, name, 'key' ].join('_') - , ifExists: true - , cascade: true - } - }; - } - - if ( !('unique' in oldCol) && 'unique' in newCol ){ - return { - addConstraint: { - name: [ table, name, 'key' ].join('_') - , unique: [ name ] - } - }; - } -}); \ No newline at end of file diff --git a/lib/class.js b/lib/class.js deleted file mode 100644 index 28ce0c7..0000000 --- a/lib/class.js +++ /dev/null @@ -1,63 +0,0 @@ -/* Simple JavaScript Inheritance - * By John Resig http://ejohn.org/ - * MIT Licensed. - */ - -// Inspired by base2 and Prototype -var initializing = false, fnTest = /xyz/.test(function(){xyz;}) ? /\b_super\b/ : /.*/; - -// The base Class implementation (does nothing) -var Class = module.exports = function(){}; - -// Create a new Class that inherits from this class -Class.extend = function(prop) { - var _super = this.prototype; - - // Instantiate a base class (but only create the instance, - // don't run the init constructor) - initializing = true; - var prototype = new this(); - initializing = false; - - // Copy the properties over onto the new prototype - for (var name in prop) { - // Check if we're overwriting an existing function - prototype[name] = typeof prop[name] == "function" && - typeof _super[name] == "function" && fnTest.test(prop[name]) ? 
- (function(name, fn){ - return function() { - var tmp = this._super; - - // Add a new ._super() method that is the same method - // but on the super-class - this._super = _super[name]; - - // The method only need to be bound temporarily, so we - // remove it when we're done executing - var ret = fn.apply(this, arguments); - this._super = tmp; - - return ret; - }; - })(name, prop[name]) : - prop[name]; - } - - // The dummy class constructor - function Class() { - // All construction is actually done in the init method - if ( !initializing && this.initialize ) - this.initialize.apply(this, arguments); - } - - // Populate our constructed prototype object - Class.prototype = prototype; - - // Enforce the constructor to be what we expect - Class.prototype.constructor = Class; - - // And make this class extendable - Class.extend = arguments.callee; - - return Class; -}; \ No newline at end of file diff --git a/lib/client-options.js b/lib/client-options.js new file mode 100644 index 0000000..25fec1a --- /dev/null +++ b/lib/client-options.js @@ -0,0 +1,51 @@ +const parseConnectionString = require('pg-connection-string').parse; + +class ClientOptions { + static parseConnectionString( str ){ + return parseConnectionString( str ); + } + + constructor( options = {} ){ + if ( typeof options === 'string' ){ + options = ClientOptions.parseConnectionString( options ); + } else if ( typeof options === 'object' && options.connectionString ){ + options = ClientOptions.parseConnectionString( options.connectionString ); + } else if ( typeof options !== 'object' ){ + throw new InvalidClientOptionsError( typeof options ); + } + + this.host = options.host || 'localhost'; + this.port = options.port || 5432; + this.user = options.user || process.env.USER; + this.password = options.password || null; + this.database = options.database || process.env.USER; + this.ssl = typeof options.ssl === 'boolean' ? options.ssl : false; + } + + toString(){ + return 'postgres://' + [ + this.user ? this.user : '' + , this.password ? `:${this.password}` : '' + , this.user ? '@' : '' + , this.host + , this.port ? `:${this.port}` : '' + , '/' + , this.database + , this.ssl ? 
'?ssl=true' : '' + ].reduce( ( result, part ) => result + part, '' ); + } +} + +class InvalidClientOptionsError extends Error { + constructor( type ){ + super(`Invalid options type: \`${type}\` passed to \`ClientOptions\` constructor`); + } +} + +ClientOptions.props = [ + 'host', 'port', 'user', 'password', 'database', 'ssl' +]; + +ClientOptions.InvalidClientOptionsError = InvalidClientOptionsError; + +module.exports = ClientOptions; \ No newline at end of file diff --git a/lib/dal.js b/lib/dal.js deleted file mode 100644 index eb325c7..0000000 --- a/lib/dal.js +++ /dev/null @@ -1,295 +0,0 @@ -/** - * Base Data Access Class - */ - -var utils = require('./utils'); -var pg = require('./db').pg; -var mosql = require('./db').mosql; -var async = require('async'); - -var validReturningQueryTypes = [ - 'select', 'union', 'intersect', 'except' -]; - -module.exports = utils.Class.extend({ - initialize: function(connString){ - - if ( typeof connString === 'string') { - this.connString = connString; - } - - if ( connString instanceof pg.Client ) { - this.client = connString; - } - - this.table = this.name; - - // Holds queries while sync completes - this.syncQueue = []; - - this.befores = {}; - this.afters = {}; - return this; - } - -, raw: function raw(query, values, callback){ - if ( typeof values == 'function' ){ - callback = values; - values = []; - } - - callback = callback || utils.noop; - - // Use provided client or use a pool client - if ( !this.client && !this.connString ) throw new Error ('No client provided'); - - if ( this.client ) { - this.client.query( query, values, callback ); - return this; - } - - if ( this.connString ) { - pg.connect( this.connString, function( error, client, done ){ - if (error) return callback( error ); - - client.query( query, values, function( error, result ){ - done(); - - callback( error, result ); - }); - }); - } - - return this; - } - -, query: function query($query, callback){ - var this_ = this, caller = arguments.callee.caller.name; - - callback = callback || utils.noop; - - if ( this.isSyncing && !$query.ignoreSync ){ - return this.syncQueue.push( function(){ - this_.query( $query, callback ); - }); - } - - this.runBeforeFilters( caller, $query, function( error ){ - if ( error ) return callback( error ); - - var query = mosql.sql( $query ); - - return this_.raw( query.toString(), query.values, function( error, result ){ - if ( error ) return callback( error ); - - if ( validReturningQueryTypes.indexOf( $query.type ) > -1 || $query.returning ){ - return this_.runAfterFilters( caller, result.rows, $query, callback ); - } - - callback(); - }); - }); - } - -, insert: function insert(values, options, callback){ - if (typeof options == 'function'){ - callback = options; - options = null; - } - - var query = { - type: 'insert' - , table: this.table - , values: values - , returning: ['*'] - }; - - if (options){ - for (var key in options) query[key] = options[key]; - } - - return this.query(query, callback); - } - -, find: function find(where, options, callback){ - if (typeof options == 'function'){ - callback = options; - options = {}; - } - - var query = { - type: 'select' - , table: this.table - , where: where - }; - - for (var key in options) query[key] = options[key]; - - return this.query(query, callback); - } - -, findOne: function findOne(id, options, callback){ - if (typeof options == 'function'){ - callback = options; - options = {}; - } - - var where; - - if (typeof id != 'object') where = { id: id }; - else where = id; - - var query = { - type: 
'select' - , table: this.table - , where: where - , limit: 1 - }; - - for (var key in options) query[key] = options[key]; - - return this.query(query, function(error, results){ - callback(error, results && results.length > 0 ? results[0] : null); - }); - } - -, update: function update($query, $update, options, callback){ - if (typeof options == 'function'){ - callback = options; - options = {}; - } - - if (typeof $query != 'object') $query = { id: $query }; - - var query = { - type: 'update' - , table: this.table - , where: $query - , updates: $update - }; - - for (var key in options) query[key] = options[key]; - - return this.query(query, callback); - } - -, remove: function remove(id, options, callback){ - if (typeof options == 'function'){ - callback = options; - options = null; - } - - var query = { - type: 'delete' - , table: this.table - , where: typeof id == 'object' ? id : { id: id } - , returning: ['*'] - }; - - if (options){ - for (var key in options) query[key] = options[key]; - } - - return this.query(query, callback); - } - -, createIfNotExists: function createIfNotExists(callback){ - var query = { - type: 'create-table' - , table: this.table - , ifNotExists: true - , definition: this.schema - , ignoreSync: true - }; - - return this.query(query, callback); - } - -, dropTable: function dropTable(options, callback){ - if (typeof options == 'function'){ - callback = options; - options = {}; - } - - var query = { - type: 'drop-table' - , table: this.table - , ifExists: true - , ignoreSync: true - }; - - if (options){ - for (var key in options) query[key] = options[key]; - } - - return this.query(query, callback); - } - -, lock: function lock(mode, callback) { - if (typeof mode == 'function'){ - callback = mode; - mode = null; - } - - var query = 'LOCK TABLE ' + this.table + (mode ? ' IN ' + mode + ' MODE' : ''); - return this.raw(query, callback); - } - -, before: function( fnName, handlers ){ - if ( typeof this[ fnName ] != 'function' ) - throw new Error('Dirac.DAL.before cannot find function `' + fnName + '`'); - - handlers = Array.prototype.slice.call( arguments, 1 ); - - if ( !this.befores[ fnName ] ) this.befores[ fnName ] = handlers; - else this.befores[ fnName ] = this.befores[ fnName ].concat( handlers ); - - return this; - } - -, after: function( fnName, handler ){ - if ( typeof this[ fnName ] != 'function' ) - throw new Error('Dirac.DAL.after cannot find function `' + fnName + '`'); - - handlers = Array.prototype.slice.call( arguments, 1 ); - - if ( !this.afters[ fnName ] ) this.afters[ fnName ] = handlers; - else this.afters[ fnName ] = this.afters[ fnName ].concat( handlers ); - } - -, runBeforeFilters: function( type, $query, callback ){ - if ( !(type in this.befores) ) return callback ? callback() : null, this; - - var this_ = this; - - async.series( - // Prepare befores to be async-series ready - this.befores[ type ].map( function( handler ){ - return function( done ){ - handler( $query, this_.schema, done ); - } - }) - // On series completion - , callback - ); - } - -, runAfterFilters: function( type, results, $query, callback ){ - if ( !(type in this.afters) ) return callback ? 
callback( null, results ) : null, this; - - var this_ = this; - - async.series( - // Prepare afters to be async-series ready - this.afters[ type ].map( function( handler ){ - return function( done ){ - handler( results, $query, this_.schema, done ); - } - }) - // On series completion - , function( error ){ - callback( error, results ); - } - ); - } -}); diff --git a/lib/database.js b/lib/database.js new file mode 100644 index 0000000..61c1ec9 --- /dev/null +++ b/lib/database.js @@ -0,0 +1,234 @@ +var _ = require('lodash'); +var PGPool = require('pg-pool'); + +var Table = require('./table'); +var Query = require('./query'); +var Transaction = require('./transaction'); +var ClientOptions = require('./client-options'); +var QueryGenerator = require('./query-generator'); +var RawQueryExecutor = require('./raw-query-executor'); +var createTableGraph = require('./table-graph'); + +class Database extends QueryGenerator { + static create( options ){ + return new Database( options ); + } + + constructor( options = {} ){ + if ( typeof options === 'string' ){ + options = { connectionString: options }; + } + + var clientOptions = options.clientOptions || new ClientOptions( options ); + + options.pool = options.pool || new PGPool( clientOptions ); + + super( options ); + + this.Table = options.Table || Table; + this.Query = options.Query || Query; + this.tables = {}; + this.clientOptions = clientOptions; + } + + clone(){ + var options = Object.assign( {}, this.options ); + + options.queryTransforms = this.queryTransforms.slice(0); + options.resultsTransforms = this.resultsTransforms.slice(0); + options.Table = this.Table; + options.Query = this.Query; + options.pool = this.pool; + + // Always share the initial pool passed in + options.pool = this.pool; + + var tables = Object + .keys( this.tables ) + .reduce( ( newTables, key ) => { + // For now, don't worry about cloning + // newTables[ key ] = this.tables[ key ].clone(); + newTables[ key ] = this.tables[ key ]; + return newTables; + }, {} ); + + var database = this.constructor.create( options ); + + database.tables = tables; + database.graph = _.cloneDeep( this.graph ); + database.clientOptions = new ClientOptions( database.clientOptions ); + + for ( var key in tables ){ + database.defineGetterAndSettersForTable( tables[ key ] ); + } + + return database; + } + + transaction(){ + return Transaction.create({ pool: this.pool, debug: this.options.debug }); + } + + table( table, TableClass ){ + TableClass = TableClass || this.Table; + + var options = this.getCreateQueryOptions(); + + if ( typeof table === 'string' ){ + options.name = table; + return TableClass.create( options ); + } + + return TableClass.create( Object.assign( options, table ) ); + } + + /** + * Given an instanceof Table, create a new version of the table + * with all of `this` instance's properties + * @param {Table} table + * @return {Table} + */ + adaptTable( table ){ + return table.clone().mutate( table => { + const options = this.getCreateQueryOptions() + + options.queryTransforms = options.queryTransforms.concat( + table.queryTransforms + ) + + options.resultsTransforms = options.resultsTransforms.concat( + table.resultsTransforms + ) + + table.options.debug = this.options.debug + table.pool = table.options.pool = this.pool + + Object.assign( table, options ) + }); + } + + register( table ){ + var database = this.instance(); + + if ( !(table instanceof Table ) ){ + table = database.table( table ); + } else { + table = database.adaptTable( table ); + } + + database.tables[ table.name 
] = table; + database.defineGetterAndSettersForTable( table ); + + database.graph = createTableGraph( database.tables ); + + return database; + } + + defineGetterAndSettersForTable( table ){ + Object.defineProperty( this, table.name, { + get: ()=> this.tables[ table.name ] + , set: table => this.tables[ table.name ] = table + , enumerable: true + }); + } + + createOrReplaceTables(){ + return this.transaction() + .begin() + .then( tx => { + var createTablesQuery = this.getCreateTableAndViewList() + .map( table => this.tables[ table ] ) + .map( table => table.create() ) + .map( query => tx.query( query ) ); + + return Promise + .all( createTablesQuery ) + .then( values => tx.commit() ) + .catch( e => { + console.error(e) + return tx.abort() + }); + }) + } + + // Adapted from legacy and realllly should be refactored + getCreateTableAndViewList(){ + // Determine order that tables need to be created + var ordered = [], column; + + // Represent the references as a directed acyclic graph + var graph = Object + .keys( this.graph ) + .reduce( ( g, table )=>{ + g[ table ] = { + dependencies: Object.keys( this.graph[ table ].dependencies ) + , incoming: Object.keys( this.graph[ table ].dependents ) + }; + + return g; + }, {} ); + + // Get set of nodes with no edges + var notDependedOn = []; + for (var table in graph){ + if (graph[table].incoming.length == 0) notDependedOn.push(table); + } + + // Perform topological sort on DAG + var table, node; + while (table = notDependedOn.pop()){ + ordered.unshift( table ); + + // Table has no dependencies, so it doesn't matter where it is + if (graph[table].dependencies.length == 0) continue; + + // Remove edges from table to dependencies + while ( graph[table].dependencies.length > 0 ){ + node = graph[table].dependencies.pop(); + graph[node].incoming = graph[node].incoming.filter(function(t){ + return t != table; + }); + + if ( graph[node].incoming.length == 0 ){ + notDependedOn.push( node ); + } + } + } + + var treeIsCyclic = Object + .keys( graph ) + .filter( table => { + return graph[ table ].incoming.length > 0 || graph[ table ].dependencies.length > 0; + }).length > 0; + + if ( treeIsCyclic ){ + throw new CyclicDependencyError(); + } + + return ordered; + } +} + +Object.assign( Database.prototype, RawQueryExecutor ); + +// Setup getters/setters for ClientOptions +ClientOptions.props.forEach( prop => { + Database.prototype[ prop ] = function( val ){ + if ( val === undefined ){ + return this.clientOptions[ prop ]; + } + + var this_ = this.instance(); + this_.clientOptions[ prop ] = val; + this_.pool.options[ prop ] = val; + return this_; + } +}); + +module.exports = Database; + +class CyclicDependencyError extends Error { + constructor(){ + super('Dependency tree is cyclic'); + } +} \ No newline at end of file diff --git a/lib/db.js b/lib/db.js deleted file mode 100644 index 2be0676..0000000 --- a/lib/db.js +++ /dev/null @@ -1,26 +0,0 @@ -/** - * Database access - */ - -var mosql = require('mongo-sql'); -var pg = require('pg'); - -var Db = function( opts ){ - opts = opts || {}; - this.mosql = opts.mosql || mosql; - this.pg = opts.pg || pg; -} - -Db.prototype.setMosql = function( instance ){ - this.mosql = instance; -} - -Db.prototype.setPg = function( instance ){ - this.pg = instance; -} - -/** - * Expose a db object - */ - -module.exports = new Db(); diff --git a/lib/dirac-table.js b/lib/dirac-table.js deleted file mode 100644 index faa4b41..0000000 --- a/lib/dirac-table.js +++ /dev/null @@ -1,44 +0,0 @@ -module.exports = { - name: 'dirac_schemas' -, 
schema: { - id: { - type: 'serial' - , primaryKey: true - } - - , tableName: { - type: 'text' - } - - , schema: { - type: 'json' - } - - , version: { - type: 'int' - } - - , createdAt: { - type: 'timestamp' - , default: 'now()' - } - } - -, findLatest: function( callback ){ - var $latestVersionQuery = { - type: 'select' - , table: this.table - , columns: [ 'max(version) as latest' ] - }; - - var $query = { - type: 'select' - , table: [this.table, 'versions'] - , columns: ['*'] - , with: { versions: $latestVersionQuery } - , where: { version: '$versions.latest$' } - }; - - return this.query( $query, callback ); - } -}; \ No newline at end of file diff --git a/lib/dirac.js b/lib/dirac.js deleted file mode 100644 index aabc904..0000000 --- a/lib/dirac.js +++ /dev/null @@ -1,489 +0,0 @@ -var - dirac = module.exports = {} -, EventEmitter = require('events').EventEmitter -, pg = require('./db').pg -, mosql = require('./db').mosql -, utils = require('./utils') -, DAL = require('./dal') -, atsStrats = require('./alter-table-sync-strategies') -; - -for ( var key in EventEmitter.prototype ){ - dirac[ key ] = EventEmitter.prototype[ key ]; -} - -dirac.DAL = DAL; - -dirac.Dals = {}; -dirac.dals = {}; -dirac.views = {}; - -dirac.afterInits = []; - -var defaults = { - host: 'localhost' -, port: 5432 -}; - -dirac.init = function(options){ - if ( typeof options === 'string' ){ - options = { connString: options }; - } - - options = options || {}; - - if ( !('connString' in options) && !('database' in options) ){ - throw new Error('Dirac.init options requires property `connString` or `database`'); - } - - dirac.options = options; - - for ( var key in defaults ){ - if ( !(key in dirac.options) ) dirac.options[ key ] = defaults[ key ]; - } - - if ( !dirac.options.connString ){ - dirac.options.connString = 'postgres://' + options.host + ':' + options.port + '/' + options.database; - } - - // Set node-pg options - for ( var key in options ){ - if ( key in pg.defaults ) pg.defaults[ key ] = options[ key ]; - } - - Object.keys( dirac.Dals ).forEach( dirac.instantiateDal ); - - // Run `use` functions - for ( var i = 0; i < dirac.afterInits.length; ++i ){ - dirac.afterInits[i]( dirac ); - } - - dirac.hasInitialized = true; -}; - -dirac.use = function( fn, options ){ - if ( typeof fn !== 'function' ){ - throw new Error('dirac.use - invalid argument type `' + typeof fn + '`'); - } - - options = options || {}; - - if ( options.immediate || fn.__immediate ) return fn( dirac ); - - if ( dirac.hasInitialized ) return fn( dirac ); - - dirac.afterInits.push( fn ); -}; - -dirac.instantiateDal = function( dal ){ - dirac.dals[ dal ] = new dirac.Dals[ dal ]( dirac.options.connString ); -}; - -dirac.destroy = function(){ - Object.keys( dirac.dals ).forEach( dirac.unregister ); - dirac.register( require('./dirac-table') ); - - dirac.afterInits = []; - delete dirac.options; - dirac.hasInitialized = false; -}; - -/** - * The premise here is to query the dirac_schemas table for previous schemas - * compare those schemas against what's loaded in memory and determine the - * required SQL to get the database in the correct state. 
- * with test as (select max(val) as max from vals) select vals.* from vals, test where val = test.max; - */ - -dirac.sync = function(options, callback){ - dirac.setSyncing(true); - - if ( typeof options == 'function' ){ - callback = options; - options = {}; - } - - options = options || {}; - - var oldCallback = callback || utils.noop; - callback = function(){ - dirac.setSyncing(false); - oldCallback.apply( null, arguments ); - }; - - - if ( options.force ) return (function(){ - dirac.dropAllTables( function( error ){ - if ( error && callback ) return callback( error) - if (error) console.log("ERROR: ", error); - dirac.createTables( callback ); - }); - })(); - - dirac.createTables(function(error){ - - if (error && callback) return callback(error); - if (error) console.log("ERROR: ", error); - - dirac.dals.dirac_schemas.findLatest( function(error, results){ - if (error && callback) callback(error); - if (error) console.log("ERROR: ", error); - if (results.length == 0) return dirac.saveCurrentDbState(1, callback); - - var currentDb = {}; - var newDb = {}; - var newColumn, oldColumn; - var version; - var actionResult; - var $alter = { - type: 'alter-table' - }; - - // Build current db representation - results.forEach(function(table){ - version = table.version; - currentDb[table.tableName] = table.schema; - }); - - // Build new db representation - for (var key in dirac.dals){ - if (key == 'dirac') continue; - newDb[key] = dirac.dals[key].schema; - } - - // Perform non-destructive table changes - for (var table in newDb){ - // Skip if new table since it was already taken care of - if (!(table in currentDb)) - continue; - - $alter = { - type: 'alter-table' - , table: table - }; - - // Inspect table - see if we've added new columns or attributes - for (var columnName in newDb[table]){ - newColumn = newDb[table][columnName]; - - if (!(columnName in currentDb[table])){ - $alter.action = { - addColumn: newColumn - }; - $alter.action.addColumn.name = columnName; - dirac.dals.dirac_schemas.query($alter, function(error){ - if (error) console.log("ERROR ADDING COLUMN:", error); - if (error && callback) return callback(error); - if (error) throw error; - }); - continue; - } - - $alter.action = []; - - oldColumn = currentDb[table][columnName]; - - // Inspect column attributes - utils.union( - Object.keys( newColumn ) - , Object.keys( oldColumn ) - ).forEach( function( attr ){ - if ( !atsStrats.has( attr ) ) return; - - actionResult = atsStrats.get( attr ).fn( - columnName, newColumn, oldColumn, newDb[table], currentDb[table], table - ); - - if ( Array.isArray( actionResult ) ){ - $alter.action = $alter.action.concat( actionResult ); - } else if ( typeof actionResult === 'object' && !!actionResult ) { - $alter.action.push( actionResult ); - } - }); - - // No change made, do not query - if ($alter.action.length == 0){ - continue; - } - - dirac.dals.dirac_schemas.query($alter, function(error){ - if (error) return console.log("ERROR ALTERING COLUMN", error); - }); - } - } - - dirac.saveCurrentDbState(++version, callback); - }); - }); -}; - -dirac.saveCurrentDbState = function(version, callback){ - utils.series( - Object.keys( dirac.dals ).map( function( key ){ - return function( done ){ - dirac.dals.dirac_schemas.insert({ - tableName: key - , version: version - , schema: JSON.stringify( dirac.dals[key].schema ) - }, done ); - } - }) - , callback - ); -}; - -dirac.createTables = function(callback){ - // Determine order that tables need to be created - var ordered = [], column; - - // Represent the references 
as a directed acyclic graph - var graph = {}; - - for (var table in dirac.dals){ - if (table in dirac.views) continue; - graph[table] = { - dependencies: [] - , incoming: [] - }; - } - - for (var table in graph){ - for (var col in dirac.dals[table].schema){ - column = dirac.dals[table].schema[col]; - - // Table does not depend on anything - if (!column.references) continue; - - graph[table].dependencies.push(column.references.table); - graph[column.references.table].incoming.push(table); - } - } - - // Get set of nodes with no edges - var notDependedOn = []; - for (var table in graph){ - if (graph[table].incoming.length == 0) notDependedOn.push(table); - } - // Perform topological sort on DAG - var table, node; - while (table = notDependedOn.pop()){ - ordered.unshift( table ); - - // Table has no dependencies, so it doesn't matter where it is - if (graph[table].dependencies.length == 0) continue; - - // Remove edges from table to dependencies - while ( graph[table].dependencies.length > 0 ){ - node = graph[table].dependencies.pop(); - graph[node].incoming = graph[node].incoming.filter(function(t){ - return t != table; - }); - - if ( graph[node].incoming.length == 0 ){ - notDependedOn.push( node ); - } - } - } - - // Ensure solution was found - if ( Object.keys( graph ).filter( function( table ){ - return graph[ table ].incoming.length > 0 || graph[ table ].dependencies.length > 0; - }).length > 0 ) return callback( new Error( 'Dependency tree is cyclic' )); - - utils.series( - ordered.map( function( table ){ - return function( done ){ - dirac.dals[table].createIfNotExists( done ); - } - }) - , function( error ){ - if ( error ) return callback( error ); - - dirac.createViews( callback ); - } - ); -}; - -dirac.createViews = function( callback ){ - utils.series( - Object.keys( dirac.views ).map( function( viewName ){ - return function( done ){ - var result = mosql.sql({ - type: 'create-view' - , view: viewName - , orReplace: true - , expression: dirac.views[ viewName ]._query - }); - - var query = result.toString(); - - // Node-PG is doing funky stuff with paramaterized - // create view queries. So de-paramaterize - result.values.forEach( function( r, i ){ - query = query.replace( new RegExp( '\\$' + (i + 1), 'g' ), '$$$' + r + '$$$' ); - }); - dirac.raw( query, done ); - }; - }) - , function( error ){ - return callback( error ); - } - ); -}; - -dirac.dropAllTables = function(options, callback){ - if ( typeof options == 'function' ){ - callback = options; - options = {}; - } - - options = options || {}; - - utils.series( - Object.keys( dirac.dals ).map( function( table ){ - return function( done ){ - if ( table == 'dirac_schemas' && !options.forceDirac ) return done(); - dirac.dals[table].dropTable({ cascade: true }, done); - } - }) - , callback - ); -}; - -dirac.register = function(definition){ - if ( !('name' in definition) ) - throw new Error('Dirac.register `name` required in definition'); - - if ( !('schema' in definition) && definition.type != 'view' ) - throw new Error('Dirac.register `schema` required in definition'); - - if ( definition.type == 'view' && typeof definition.query != 'object' ) - throw new Error('Dirac.register `query` required when `type` is `view`'); - - var name = definition.alias || definition.name; - var view; - - if ( definition.type == 'view' ){ - view = dirac.views[ name ] = utils.clone( definition ); - - // Do not override DAL.query - view._query = view.query; - delete view.query; - } - - var DAL = dirac.DAL.extend( definition.type == 'view' ? 
view : definition ); - - dirac.Dals[ name ] = DAL; - - if ( dirac.hasInitialized ) dirac.instantiateDal( name ); - - return DAL; -}; - -dirac.unregister = function(name){ - delete dirac.dals[ name ]; - delete dirac.Dals[ name ]; - delete dirac.views[ name ]; -}; - -dirac.query = function( $query, callback ){ - var query = mosql.sql( $query ); - dirac.raw( query.toString(), query.values, callback ); -}; - -dirac.raw = function( query, values, callback ){ - if ( typeof values == 'function' ){ - callback = values; - values = []; - } - - callback = callback || utils.noop; - - pg.connect( dirac.options.connString, function( error, client, done ){ - if (error) return callback( error ); - client.query( query, values, function( error, result ){ - done(); - - callback( error, result ); - }); - }); -}; - -dirac.setSyncing = function( val ){ - dirac.isSyncing = !!val; - - for ( var key in dirac.dals ){ - if ( key === 'dirac_schemas' ) continue; - dirac.dals[ key ].isSyncing = dirac.isSyncing; - } - - if ( dirac.isSyncing ) return; - - // Sync complete, let's drain queues - for ( var key in dirac.dals ){ - if ( key === 'dirac_schemas' ) continue; - if ( dirac.dals[ key ].syncQueue.length === 0 ) continue; - dirac.dals[ key ].syncQueue.forEach( function( fn ){ - fn(); - }); - } -}; - -dirac.tx = { - create: function() { - var client = new pg.Client( dirac.options.connString ); - client.connect(); - - var throwError = function() { - throw new Error('Transaction already committed'); - }; - - var releaseClient = function() { - client.end(); - client = null; - Object.keys( dirac.Dals ).forEach(function(tbl) { - tx[tbl].client = null; - }); - }; - - var tx = { - begin: function( callback ){ - if ( !client ) throwError(); - client.query( 'BEGIN', callback ); - } - , rollback: function( callback ){ - if ( !client ) throwError(); - client.query( 'ROLLBACK', function( err ){ - if (!err) releaseClient(); - callback(err); - }); - } - , commit: function( callback ){ - if ( !client ) throwError(); - client.query( 'COMMIT', function( err ){ - if (!err) releaseClient(); - callback(err); - }); - } - }; - - // Attach DALs to the tx client - Object.keys( dirac.Dals ).forEach( function( tbl ){ - var Dal = dirac.Dals[ tbl ]; - tx[ tbl ] = new Dal( client ); - tx[ tbl ].befores = dirac.dals[ tbl ].befores; - tx[ tbl ].afters = dirac.dals[ tbl ].afters; - }); - - for ( var key in tx ){ - client[ key ] = tx[ key ]; - } - - return client; - } -}; - -module.exports = dirac = Object.create( dirac ); -EventEmitter.call( dirac ); - -dirac.register( require('./dirac-table') ); diff --git a/lib/errors.js b/lib/errors.js new file mode 100644 index 0000000..1ec62bc --- /dev/null +++ b/lib/errors.js @@ -0,0 +1,5 @@ +module.exports.MissingPool = class MissingPoolError extends Error { + constructor(){ + super('Must supply an instance of Pool'); + } +} \ No newline at end of file diff --git a/lib/immutable.js b/lib/immutable.js new file mode 100644 index 0000000..87dc753 --- /dev/null +++ b/lib/immutable.js @@ -0,0 +1,74 @@ +/** + * @class Immutable + * Creates an object with immutable helper methods. 
It is + * expected that implementors of an Immutable class will + * implement their own `clone()` method as this is usually + * specific to each object type + */ +class Immutable { + /** + * Create an instance of @Immutable + * @param {Object} options Options passed to instance + * @return {Immutable} + */ + static create( options ){ + return new this( options ); + } + + /** + * Create an instance of @Immutable + * Assumes that you are wanting to create an object with some + * options and to assign those options to the instance + * + * Relevant Options: + * { + * // Whether or not this instance should behave immutably + * immutable: true|false + * } + * + * @param {Object} options Options passed to instance + * @return {Immutable} + */ + constructor( options = {} ){ + this.options = options; + this.options.immutable = typeof this.options.immutable === 'boolean' + ? this.options.immutable : true; + } + + /** + * Gets the current instance. If immutable is on, this returns + * a clone. Otherwise, it returns `this` + * @return {Immutable} This or a new instance + */ + instance(){ + return this.options.immutable ? this.clone() : this; + } + + /** + * Clone the current instance. It is expected that consumers + * of this class override this method + * @return {Immutable} + */ + clone(){ + var options = Object.assign( {}, this.options ); + return Immutable.create( options ); + } + + /** + * Temporarily disables immutability so any changes happen + * to `this` rather than a new instance + * @param {Function} handler(this) The function that mutates + * @return {Immutable} This instance + */ + mutate( handler ){ + var prev = this.options.immutable; + + this.options.immutable = false; + handler( this ); + this.options.immutable = prev; + + return this; + } +} + +module.exports = Immutable; \ No newline at end of file diff --git a/lib/query-base.js b/lib/query-base.js new file mode 100644 index 0000000..ad3c338 --- /dev/null +++ b/lib/query-base.js @@ -0,0 +1,107 @@ +/** + * Maybe rename to this to Transformable since it mainly handles + * query transforms + */ + +const _ = require('lodash'); +const mosql = require('mongo-sql'); +const Errors = require('./errors'); +const Immutable = require('./immutable'); +const ClientOptions = require('./client-options'); +const QueryTransform = require('./query-transform'); +const ResultTransform = require('./result-transform'); + +class QueryBase extends Immutable { + static create( options ){ + return new this( options ); + } + + constructor( options = {} ){ + super( options ); + + this.options = options; + this.pool = options.pool; + this.mosql = options.mosql || mosql; + this.queryTransforms = options.queryTransforms || []; + this.resultsTransforms = options.resultsTransforms || []; + this.clientOptions = options.clientOptions || new ClientOptions( options ); + } + + clone(){ + var options = Object.assign( {}, this.options ); + options.queryTransforms = this.queryTransforms.slice(0); + options.resultsTransforms = this.resultsTransforms.slice(0); + + return QueryBase.create( options ); + } + + use( middleware ){ + if ( middleware instanceof QueryTransform ){ + return this.before( middleware ); + } else if ( middleware instanceof ResultTransform ){ + return this.after( middleware ); + } else if ( typeof middleware === 'function' ){ + return middleware( this ); + } + + throw new InvalidTransformError(); + } + + before( transform ){ + // If providing an array of transforms, do not clone query for each + if ( Array.isArray( transform ) ){ + return 
this.instance().mutate( query => { + transform.forEach( t => query.before( t ) ); + }); + } + + if ( typeof transform === 'function' ){ + transform = QueryTransform.create( transform ); + } + + if ( !(transform instanceof QueryTransform) ){ + throw new InvalidTransformError('QueryTransform'); + } + + var this_ = this.instance(); + this_.queryTransforms.push( transform ); + + return this_; + } + + after( transform ){ + // If providing an array of transforms, do not clone query for each + if ( Array.isArray( transform ) ){ + return this.instance().mutate( query => { + transform.forEach( t => query.after( t ) ); + }); + } + + if ( typeof transform === 'function' ){ + transform = ResultTransform.create( transform ); + } + + if ( !(transform instanceof ResultTransform) ){ + throw new InvalidTransformError('ResultTransform'); + } + + var this_ = this.instance(); + this_.resultsTransforms.push( transform ); + + return this_; + } +} + +class InvalidTransformError extends Error { + constructor( type ){ + if ( !type ){ + super('Invalid Transform type'); + } else { + super(`Transform must be either a "Function" or "${type}"`); + } + } +} + +QueryBase.InvalidTransformError = InvalidTransformError; + +module.exports = QueryBase; \ No newline at end of file diff --git a/lib/query-debugger.js b/lib/query-debugger.js new file mode 100644 index 0000000..b4d07dd --- /dev/null +++ b/lib/query-debugger.js @@ -0,0 +1,110 @@ +const chalk = require('chalk'); +const sqlformatter = require('sqlformatter').format; + +class QueryDebugger { + static getTimeColorizer( seconds ){ + if ( seconds < 0.2 ) return chalk.green; + if ( seconds < 0.5 ) return chalk.yellow; + return chalk.red; + } + + constructor( query, values, original ){ + this.query = query; + this.values = values; + this.original = original; + this.begin = new Date(); + } + + log(){ + console.log( this.highlight() ); + } + + end(error){ + var seconds = (new Date() - this.begin) / 1000; + var colorizer = QueryDebugger.getTimeColorizer( seconds ); + var secondsStr = colorizer( seconds + 's' ) + + if (error) { + console.log(error); + console.log('Errored after', secondsStr); + } else { + console.log('Finished executing in', secondsStr); + } + console.log(''); + } + + highlight(){ + var ordering = QueryDebugger.highlightOrdering; + + return sqlformatter( this.query ) + .replace( QueryDebugger.regex.value, match => { + var value = this.values[ +match.substring(1) - 1 ]; + + if ( typeof value === 'number' ){ + return value + } + + if ( typeof value === 'string' ){ + return `'${value}'` + } + + if ( value instanceof Date ){ + return `'${value.toISOString()}'` + } + + return `'${JSON.stringify(value)}'`; + }) + .split('\n') + .map( line => { + return line.split(/\s/g).map( token => { + var highlighter; + + for ( var i = ordering.length - 1, regex; i >= 0; i-- ){ + regex = QueryDebugger.regex[ ordering[ i ] ]; + + if ( regex.test( token ) ){ + highlighter = QueryDebugger.style[ ordering[ i ] ]; + break; + } + } + + if ( highlighter ){ + return highlighter( token ); + } + + return token; + }) + .join(' '); + }) + .join('\n') + } +} + +QueryDebugger.highlightOrdering = [ + 'keyword' +, 'number' +, 'primitive' +, 'value' +, 'quotes' +, 'string' +]; + +QueryDebugger.regex = { + keyword: new RegExp( require('./sql-keywords').join('|'), 'ig' ) +, quotes: /("[^"]*")/g +, string: /('[^']*')/g +, number: /-?\d+(?:\.\d+)?(?:e-?\d+)?/g +, primitive: /true|false|null/g +, value: /\$\d+/g +}; + +QueryDebugger.style = { + keyword: token => chalk.bold.white( token ) +, quotes: 
token => chalk.cyan( token )
+, string: token => chalk.yellow( token )
+, number: token => chalk.red( token )
+, primitive: token => chalk.red( token )
+, value: token => chalk.blue( token )
+};
+
+module.exports = QueryDebugger;
\ No newline at end of file
diff --git a/lib/query-generator.js b/lib/query-generator.js
new file mode 100644
index 0000000..0bcf05f
--- /dev/null
+++ b/lib/query-generator.js
@@ -0,0 +1,52 @@
+const _ = require('lodash');
+const mosql = require('mongo-sql');
+const QueryBase = require('./query-base');
+const Query = require('./query');
+
+class QueryGenerator extends QueryBase {
+  constructor( options = {} ){
+    super( options );
+    this.Query = options.Query || Query;
+  }
+
+  query( query, customOptions ){
+    var options = this.getCreateQueryOptions();
+
+    if ( typeof customOptions === 'object' ){
+      Object.assign( options, customOptions );
+    }
+
+    return this.Query.create( query, options );
+  }
+
+  stream( query, customOptions ){
+    var options = this.getCreateQueryOptions();
+
+    if ( typeof customOptions === 'object' ){
+      Object.assign( options, customOptions );
+    }
+
+    // Streaming is delegated to the Query instance, which wraps pg-query-stream
+    return this.Query.create( query, options ).stream();
+  }
+
+  clone(){
+    var options = Object.assign( {}, this.options );
+    options.queryTransforms = this.queryTransforms.slice(0);
+    options.resultsTransforms = this.resultsTransforms.slice(0);
+
+    return QueryGenerator.create( options );
+  }
+
+  getCreateQueryOptions(){
+    return {
+      pool: this.pool
+    , mosql: this.mosql
+    , queryTransforms: this.queryTransforms.slice(0)
+    , resultsTransforms: this.resultsTransforms.slice(0)
+    , Query: this.Query
+    , debug: this.options.debug
+    };
+  }
+}
+
+module.exports = QueryGenerator;
\ No newline at end of file
diff --git a/lib/query-transform.js b/lib/query-transform.js
new file mode 100644
index 0000000..7c91129
--- /dev/null
+++ b/lib/query-transform.js
@@ -0,0 +1,15 @@
+class QueryTransform {
+  static create( handler ){
+    return new this( handler );
+  }
+
+  constructor( handler ){
+    this.handler = handler;
+  }
+
+  execute( query ){
+    return this.handler( query );
+  }
+}
+
+module.exports = QueryTransform;
\ No newline at end of file
diff --git a/lib/query.js b/lib/query.js
new file mode 100644
index 0000000..1bdd609
--- /dev/null
+++ b/lib/query.js
@@ -0,0 +1,347 @@
+var _ = require('lodash');
+var mosql = require('mongo-sql');
+var PGPool = require('pg-pool');
+var PGQueryStream = require('pg-query-stream');
+var QueryTransform = require('./query-transform');
+var RawQueryExecutor = require('./raw-query-executor');
+var QueryBase = require('./query-base');
+var QueryDebugger = require('./query-debugger');
+var Errors = require('./errors');
+
+class Query extends QueryBase {
+  static create( query, options ){
+    return new this( query, options );
+  }
+
+  static getUpsertUpdateFromObject( values ){
+    return Object
+      .keys( Array.isArray( values ) ? values[0] : values )
+      .reduce( (result, k) => {
+        if ( typeof values[ k ] === 'string' ){
+          // Use the module-level mosql since this is a static method
+          result[ k ] = `$excluded.${mosql.quoteObject(k)}`;
+        } else {
+          result[ k ] = values[ k ];
+        }
+
+        return result;
+      }, {});
+  }
+
+  static getDedupedQueryObj(mobj) {
+    var dedupedValues = mobj.values
+      .filter((v, i, values) => {
+        return values.indexOf(v, i + 1) === -1
+      }, [])
+
+    var sourceMap = mobj.values
+      .reduce((obj, v, i) => {
+        obj[i] = dedupedValues.indexOf(v)
+        return obj
+      }, {})
+
+    var queryStr = mobj.query
+      .replace(QueryDebugger.regex.value, match => {
+        const targetIdx = +match.substring(1) - 1
+        const resultIdx = sourceMap[targetIdx]
+        const wasChanged = resultIdx !== undefined && resultIdx !== -1
+        return `$${(wasChanged ? resultIdx : targetIdx) + 1}`
+      })
+
+    return {
+      query: queryStr,
+      values: dedupedValues,
+      original: mobj.original,
+      toString: () => queryStr,
+      toQuery: () => {
+        return { text: queryStr, values: dedupedValues }
+      },
+    }
+  }
+
+  constructor( query = {}, options = {} ){
+    super( options );
+    this.mosqlQuery = query;
+  }
+
+  execute(){
+    if ( !this.pool ){
+      throw new Errors.MissingPool();
+    }
+
+    var query;
+
+    try {
+      query = this.toStringAndValues()
+    } catch ( e ){
+      return Promise.reject( e );
+    }
+
+    var qdebugger;
+
+    // Instantiate qdebugger here so we get the beginning time
+    if ( this.options.debug ){
+      qdebugger = new QueryDebugger( query.toString(), query.values );
+    }
+
+    return this.pool
+      .query( query.toString(), query.values )
+      .catch( e => {
+        // If there was an error, we still want to log
+        if ( this.options.debug ){
+          qdebugger.log();
+          qdebugger.end(e);
+        }
+
+        // But we need to re-throw to force the consumer to handle
+        throw e;
+      })
+      .then( result => {
+        if ( this.options.debug ){
+          qdebugger.log();
+          qdebugger.end();
+        }
+
+        return result;
+      })
+      .then( result => this.getTransformedResult( result.rows ) )
+  }
+
+  stream(){
+    var query;
+    var qdebugger;
+
+    try {
+      query = this.toStringAndValues()
+
+      // Instantiate qdebugger here so we get the beginning time
+      if ( this.options.debug ){
+        qdebugger = new QueryDebugger( query.toString(), query.values );
+      }
+
+      query = new PGQueryStream( query.toString(), query.values );
+    } catch ( e ){
+      return Promise.reject( e );
+    }
+
+    return this.pool.connect()
+      .then( client => {
+        var stream = client.query( query );
+        if ( this.options.debug ){
+          stream.once( 'data', ()=> {
+            qdebugger.log();
+            qdebugger.end();
+          });
+        }
+        stream.once( 'end', ()=> client.release() );
+        return stream;
+      });
+  }
+
+  clone(){
+    var options = Object.assign( {}, this.options );
+
+    options.queryTransforms = this.queryTransforms.slice(0);
+    options.resultsTransforms = this.resultsTransforms.slice(0);
+
+    return this.constructor.create( _.cloneDeep( this.mosqlQuery ), options );
+  }
+
+  where( key, val ){
+    if ( key === undefined ){
+      return this.mosqlQuery.where;
+    }
+
+    if ( typeof key === 'string' && val === undefined ){
+      return _.get( this.mosqlQuery.where, key );
+    }
+
+    var query = this.instance();
+
+    // If the underlying query did not already have a `where` clause
+    if ( !query.mosqlQuery.where ){
+      query.mosqlQuery.where = {};
+    }
+
+    if ( typeof key === 'string' ){
+      _.set( query.mosqlQuery.where, key, val );
+    } else {
+      Object.assign( query.mosqlQuery.where, key );
+    }
+
+    return query;
+  }
+
+  columns( column ){
+    if ( !column ){
+      return this.mosqlQuery.columns;
+    }
+
+    var query = this.instance();
+
+    if ( !Array.isArray( query.mosqlQuery.columns ) ){
+      query.mosqlQuery.columns = [];
+    }
+
+
if (Array.isArray(column)) { + query.mosqlQuery.columns = column; + } else { + query.mosqlQuery.columns.push( column ); + } + + return query; + } + + groupBy( value ){ + if ( !value ){ + return this.mosqlQuery.groupBy; + } + + var query = this.instance(); + + if ( !Array.isArray( query.mosqlQuery.groupBy ) ){ + query.mosqlQuery.groupBy = []; + } + + if (Array.isArray(value)) { + query.mosqlQuery.groupBy = value; + } else { + query.mosqlQuery.groupBy.push( value ); + } + + return query; + } + + returning( column ){ + if ( !column ){ + return this.mosqlQuery.returning; + } + + var query = this.instance(); + + if ( !Array.isArray( query.mosqlQuery.returning ) ){ + query.mosqlQuery.returning = []; + } + + query.mosqlQuery.returning.push( column ); + + return query; + } + + joins( join ){ + if ( !join ){ + return this.mosqlQuery.joins; + } + + var query = this.instance(); + + if ( !Array.isArray( query.mosqlQuery.joins ) ){ + query.mosqlQuery.joins = []; + } + + if ( !join.type ){ + join.type = 'left'; + } + + query.mosqlQuery.joins.push( join ); + + return query; + } + + with( withQuery ){ + if ( !withQuery ){ + return this.mosqlQuery.with; + } + + if ( withQuery instanceof Query ){ + withQuery = withQuery.mosqlQuery; + } + + var query = this.instance(); + + if ( !Array.isArray( query.mosqlQuery.with ) ){ + query.mosqlQuery.with = []; + } + + query.mosqlQuery.with.push( withQuery ); + + return query; + } + + values( values ){ + if ( values === undefined ){ + return this.mosqlQuery.values; + } + + var query = this.instance(); + query.mosqlQuery.values = values; + + // If conflict specified the same values as the insert + // then go ahead and update the conflict resolution + // if ( typeof query.conflict === 'object' && [null, undefined].indexOf( query.conflict ) === -1 ) + // if ( _.get( this.mosqlQuery, 'conflict.action.update') === this.mosqlQuery.values ){ + // query.mosqlQuery.conflict.action.update = values; + // } + + return query; + } + + getTransformedResult( rows ){ + return this.resultsTransforms.reduce( ( result, transform )=>{ + return transform.execute( result, this ); + }, rows ); + } + + getTransformedQuery(){ + if ( this.queryTransforms.length === 0 ){ + return this; + } + + return this.queryTransforms + .reduce( + ( query, transform ) => transform.handler( query ) + , this.clone().mutate(query => query.queryTransforms = []) + ) + + return this.clone().mutate( query => { + this.queryTransforms.forEach( transform => transform.handler( query ) ); + }); + } + + toStringAndValues(){ + if ( !this.mosql ){ + throw new Error('Cannot serialize a query without a MongoSQL instance'); + } + + return Query.getDedupedQueryObj( + this.mosql.sql( this.getTransformedQuery().mosqlQuery ) + ); + } +} + +Query.standardGettersAndSetters = [ + 'type', 'table', 'action', 'alias', 'cascade' +, 'definition', 'distinct', 'expression', 'for' +, 'from', 'having', 'limit', 'offset' +, 'only', 'order', 'over', 'partition', 'queries' +, 'updates', 'view', 'conflict', 'name', +]; + +Query.standardGettersAndSetters.forEach( key => { + Query.prototype[ key ] = function( value ){ + if ( value === undefined ){ + return this.mosqlQuery[ key ]; + } + + var query = this.instance(); + query.mosqlQuery[ key ] = value; + + return query; + }; +}); + + +['raw', 'getClient'].forEach( k => { + Query.prototype[ k ] = RawQueryExecutor.prototype[ k ]; +}); + +module.exports = Query; \ No newline at end of file diff --git a/lib/raw-query-executor.js b/lib/raw-query-executor.js new file mode 100644 index 0000000..9f57a15 --- 
/dev/null +++ b/lib/raw-query-executor.js @@ -0,0 +1,26 @@ +const pg = require('pg'); +const PGPool = require('pg-pool'); + +class RawQueryExecutor { + constructor( options = {} ){ + this.pool = options.pool || new PGPool( options ); + } + + getClient(){ + return this.pool.connect(); + } + + raw( query, values ){ + return this.pool.query( query, values ); + } +} + +class MissingConnectionStringError extends Error { + constructor( method ){ + super(`Cannot call "${method}" without "options.connectionString"`); + } +} + +RawQueryExecutor.MissingConnectionStringError = MissingConnectionStringError; + +module.exports = RawQueryExecutor; \ No newline at end of file diff --git a/lib/result-transform.js b/lib/result-transform.js new file mode 100644 index 0000000..0d99e90 --- /dev/null +++ b/lib/result-transform.js @@ -0,0 +1,19 @@ +class ResultTransform { + static create( handler ){ + return new ResultTransform( handler ); + } + + constructor( handler ){ + this.handler = handler; + } + + execute( result, query ){ + return this.handler( result, query ); + } +} + +ResultTransform.firstRow = ResultTransform.create( ( rows, query )=>{ + return rows[0]; +}); + +module.exports = ResultTransform; \ No newline at end of file diff --git a/lib/sql-keywords.js b/lib/sql-keywords.js new file mode 100644 index 0000000..88e1ca0 --- /dev/null +++ b/lib/sql-keywords.js @@ -0,0 +1,113 @@ +module.exports = [ + "A", "ABORT", "ABS", "ABSENT", "ABSOLUTE", "ACCESS", "ACCORDING", "ACTION", + "ADA", "ADD", "ADMIN", "AFTER", "AGGREGATE", "ALL", "ALLOCATE", "ALSO", + "ALTER", "ALWAYS", "ANALYSE", "ANALYZE", "AND", "ANY", "ARE", "ARRAY", + "ARRAY_AGG", "ARRAY_MAX_CARDINALITY", "AS", "ASC", "ASENSITIVE", "ASSERTION", + "ASSIGNMENT", "ASYMMETRIC", "AT", "ATOMIC", "ATTRIBUTE", "ATTRIBUTES", + "AUTHORIZATION", "AVG", "BACKWARD", "BASE64", "BEFORE", "BEGIN", "BEGIN_FRAME", + "BEGIN_PARTITION", "BERNOULLI", "BETWEEN", "BIGINT", "BINARY", "BIT", "BIT_LENGTH", + "BLOB", "BLOCKED", "BOM", "BOOLEAN", "BOTH", "BREADTH", "BY", "C", "CACHE", + "CALL", "CALLED", "CARDINALITY", "CASCADE", "CASCADED", "CASE", "CAST", "CATALOG", + "CATALOG_NAME", "CEIL", "CEILING", "CHAIN", "CHAR", "CHARACTER", "CHARACTERISTICS", + "CHARACTERS", "CHARACTER_LENGTH", "CHARACTER_SET_CATALOG", "CHARACTER_SET_NAME", + "CHARACTER_SET_SCHEMA", "CHAR_LENGTH", "CHECK", "CHECKPOINT", "CLASS", + "CLASS_ORIGIN", "CLOB", "CLOSE", "CLUSTER", "COALESCE", "COBOL", "COLLATE", + "COLLATION", "COLLATION_CATALOG", "COLLATION_NAME", "COLLATION_SCHEMA", "COLLECT", + "COLUMN", "COLUMNS", "COLUMN_NAME", "COMMAND_FUNCTION", "COMMAND_FUNCTION_CODE", + "COMMENT", "COMMENTS", "COMMIT", "COMMITTED", "CONCURRENTLY", "CONDITION", + "CONDITION_NUMBER", "CONFIGURATION", "CONFLICT", "CONNECT", "CONNECTION", + "CONNECTION_NAME", "CONSTRAINT", "CONSTRAINTS", "CONSTRAINT_CATALOG", + "CONSTRAINT_NAME", "CONSTRAINT_SCHEMA", "CONSTRUCTOR", "CONTAINS", "CONTENT", + "CONTINUE", "CONTROL", "CONVERSION", "CONVERT", "COPY", "CORR", "CORRESPONDING", + "COST", "COUNT", "COVAR_POP", "COVAR_SAMP", "CREATE", "CROSS", "CSV", "CUBE", + "CUME_DIST", "CURRENT", "CURRENT_CATALOG", "CURRENT_DATE", + "CURRENT_DEFAULT_TRANSFORM_GROUP", "CURRENT_PATH", "CURRENT_ROLE", "CURRENT_ROW", + "CURRENT_SCHEMA", "CURRENT_TIME", "CURRENT_TIMESTAMP", + "CURRENT_TRANSFORM_GROUP_FOR_TYPE", "CURRENT_USER", "CURSOR", "CURSOR_NAME", + "CYCLE", "DATA", "DATABASE", "DATALINK", "DATE", "DATETIME_INTERVAL_CODE", + "DATETIME_INTERVAL_PRECISION", "DAY", "DB", "DEALLOCATE", "DEC", "DECIMAL", + "DECLARE", "DEFAULT", "DEFAULTS", 
"DEFERRABLE", "DEFERRED", "DEFINED", "DEFINER", + "DEGREE", "DELETE", "DELIMITER", "DELIMITERS", "DENSE_RANK", "DEPTH", "DEREF", + "DERIVED", "DESC", "DESCRIBE", "DESCRIPTOR", "DETERMINISTIC", "DIAGNOSTICS", + "DICTIONARY", "DISABLE", "DISCARD", "DISCONNECT", "DISPATCH", "DISTINCT", + "DLNEWCOPY", "DLPREVIOUSCOPY", "DLURLCOMPLETE", "DLURLCOMPLETEONLY", + "DLURLCOMPLETEWRITE", "DLURLPATH", "DLURLPATHONLY", "DLURLPATHWRITE", + "DLURLSCHEME", "DLURLSERVER", "DLVALUE", "DO", "DOCUMENT", "DOMAIN", "DOUBLE", + "DROP", "DYNAMIC", "DYNAMIC_FUNCTION", "DYNAMIC_FUNCTION_CODE", "EACH", + "ELEMENT", "ELSE", "EMPTY", "ENABLE", "ENCODING", "ENCRYPTED", "END", "END-EXEC", + "END_FRAME", "END_PARTITION", "ENFORCED", "ENUM", "EQUALS", "ESCAPE", "EVENT", + "EVERY", "EXCEPT", "EXCEPTION", "EXCLUDE", "EXCLUDING", "EXCLUSIVE", "EXEC", + "EXECUTE", "EXISTS", "EXP", "EXPLAIN", "EXPRESSION", "EXTENSION", "EXTERNAL", + "EXTRACT", "FALSE", "FAMILY", "FETCH", "FILE", "FILTER", "FINAL", "FIRST", + "FIRST_VALUE", "FLAG", "FLOAT", "FLOOR", "FOLLOWING", "FOR", "FORCE", "FOREIGN", + "FORTRAN", "FORWARD", "FOUND", "FRAME_ROW", "FREE", "FREEZE", "FROM", "FS", + "FULL", "FUNCTION", "FUNCTIONS", "FUSION", "G", "GENERAL", "GENERATED", "GET", + "GLOBAL", "GO", "GOTO", "GRANT", "GRANTED", "GREATEST", "GROUP", "GROUPING", + "GROUPS", "HANDLER", "HAVING", "HEADER", "HEX", "HIERARCHY", "HOLD", "HOUR", + "ID", "IDENTITY", "IF", "IGNORE", "ILIKE", "IMMEDIATE", "IMMEDIATELY", "IMMUTABLE", + "IMPLEMENTATION", "IMPLICIT", "IMPORT", "IN", "INCLUDING", "INCREMENT", + "INDENT", "INDEX", "INDEXES", "INDICATOR", "INHERIT", "INHERITS", "INITIALLY", + "INLINE", "INNER", "INOUT", "INPUT", "INSENSITIVE", "INSERT", "INSTANCE", + "INSTANTIABLE", "INSTEAD", "INT", "INTEGER", "INTEGRITY", "INTERSECT", "INTERSECTION", + "INTERVAL", "INTO", "INVOKER", "IS", "ISNULL", "ISOLATION", "JOIN", "K", + "KEY", "KEY_MEMBER", "KEY_TYPE", "LABEL", "LAG", "LANGUAGE", "LARGE", "LAST", + "LAST_VALUE", "LATERAL", "LEAD", "LEADING", "LEAKPROOF", "LEAST", "LEFT", + "LENGTH", "LEVEL", "LIBRARY", "LIKE", "LIKE_REGEX", "LIMIT", "LINK", "LISTEN", + "LN", "LOAD", "LOCAL", "LOCALTIME", "LOCALTIMESTAMP", "LOCATION", "LOCATOR", + "LOCK", "LOCKED", "LOGGED", "LOWER", "M", "MAP", "MAPPING", "MATCH", "MATCHED", + "MATERIALIZED", "MAX", "MAXVALUE", "MAX_CARDINALITY", "MEMBER", "MERGE", "MESSAGE_LENGTH", + "MESSAGE_OCTET_LENGTH", "MESSAGE_TEXT", "METHOD", "MIN", "MINUTE", "MINVALUE", + "MOD", "MODE", "MODIFIES", "MODULE", "MONTH", "MORE", "MOVE", "MULTISET", + "MUMPS", "NAME", "NAMES", "NAMESPACE", "NATIONAL", "NATURAL", "NCHAR", + "NCLOB", "NESTING", "NEW", "NEXT", "NFC", "NFD", "NFKC", "NFKD", "NIL", "NO", + "NONE", "NORMALIZE", "NORMALIZED", "NOT", "NOTHING", "NOTIFY", "NOTNULL", + "NOWAIT", "NTH_VALUE", "NTILE", "NULL", "NULLABLE", "NULLIF", "NULLS", "NUMBER", + "NUMERIC", "OBJECT", "OCCURRENCES_REGEX", "OCTETS", "OCTET_LENGTH", "OF", "OFF", + "OFFSET", "OIDS", "OLD", "ON", "ONLY", "OPEN", "OPERATOR", "OPTION", "OPTIONS", + "OR", "ORDER", "ORDERING", "ORDINALITY", "OTHERS", "OUT", "OUTER", "OUTPUT", + "OVER", "OVERLAPS", "OVERLAY", "OVERRIDING", "OWNED", "OWNER", "P", "PAD", + "PARAMETER", "PARAMETER_MODE", "PARAMETER_NAME", "PARAMETER_ORDINAL_POSITION", + "PARAMETER_SPECIFIC_CATALOG", "PARAMETER_SPECIFIC_NAME", "PARAMETER_SPECIFIC_SCHEMA", + "PARSER", "PARTIAL", "PARTITION", "PASCAL", "PASSING", "PASSTHROUGH", "PASSWORD", + "PATH", "PERCENT", "PERCENTILE_CONT", "PERCENTILE_DISC", "PERCENT_RANK", "PERIOD", + "PERMISSION", "PLACING", "PLANS", "PLI", "POLICY", "PORTION", 
"POSITION", "POSITION_REGEX", + "POWER", "PRECEDES", "PRECEDING", "PRECISION", "PREPARE", "PREPARED", "PRESERVE", + "PRIMARY", "PRIOR", "PRIVILEGES", "PROCEDURAL", "PROCEDURE", "PROGRAM", "PUBLIC", + "QUOTE", "RANGE", "RANK", "READ", "READS", "REAL", "REASSIGN", "RECHECK", "RECOVERY", + "RECURSIVE", "REF", "REFERENCES", "REFERENCING", "REFRESH", "REGR_AVGX", "REGR_AVGY", + "REGR_COUNT", "REGR_INTERCEPT", "REGR_R2", "REGR_SLOPE", "REGR_SXX", "REGR_SXY", + "REGR_SYY", "REINDEX", "RELATIVE", "RELEASE", "RENAME", "REPEATABLE", "REPLACE", + "REPLICA", "REQUIRING", "RESET", "RESPECT", "RESTART", "RESTORE", "RESTRICT", + "RESULT", "RETURN", "RETURNED_CARDINALITY", "RETURNED_LENGTH", "RETURNED_OCTET_LENGTH", + "RETURNED_SQLSTATE", "RETURNING", "RETURNS", "REVOKE", "RIGHT", "ROLE", "ROLLBACK", + "ROLLUP", "ROUTINE", "ROUTINE_CATALOG", "ROUTINE_NAME", "ROUTINE_SCHEMA", "ROW", + "ROWS", "ROW_COUNT", "ROW_NUMBER", "RULE", "SAVEPOINT", "SCALE", "SCHEMA", "SCHEMA_NAME", + "SCOPE", "SCOPE_CATALOG", "SCOPE_NAME", "SCOPE_SCHEMA", "SCROLL", "SEARCH", "SECOND", + "SECTION", "SECURITY", "SELECT", "SELECTIVE", "SELF", "SENSITIVE", "SEQUENCE", "SEQUENCES", + "SERIALIZABLE", "SERVER", "SERVER_NAME", "SESSION", "SESSION_USER", "SET", "SETOF", + "SETS", "SHARE", "SHOW", "SIMILAR", "SIMPLE", "SIZE", "SKIP", "SMALLINT", "SNAPSHOT", + "SOME", "SOURCE", "SPACE", "SPECIFIC", "SPECIFICTYPE", "SPECIFIC_NAME", "SQL", "SQLCODE", + "SQLERROR", "SQLEXCEPTION", "SQLSTATE", "SQLWARNING", "SQRT", "STABLE", "STANDALONE", + "START", "STATE", "STATEMENT", "STATIC", "STATISTICS", "STDDEV_POP", "STDDEV_SAMP", + "STDIN", "STDOUT", "STORAGE", "STRICT", "STRIP", "STRUCTURE", "STYLE", "SUBCLASS_ORIGIN", + "SUBMULTISET", "SUBSTRING", "SUBSTRING_REGEX", "SUCCEEDS", "SUM", "SYMMETRIC", "SYSID", + "SYSTEM", "SYSTEM_TIME", "SYSTEM_USER", "T", "TABLE", "TABLES", "TABLESAMPLE", + "TABLESPACE", "TABLE_NAME", "TEMP", "TEMPLATE", "TEMPORARY", "TEXT", "THEN", "TIES", + "TIME", "TIMESTAMP", "TIMEZONE_HOUR", "TIMEZONE_MINUTE", "TO", "TOKEN", "TOP_LEVEL_COUNT", + "TRAILING", "TRANSACTION", "TRANSACTIONS_COMMITTED", "TRANSACTIONS_ROLLED_BACK", + "TRANSACTION_ACTIVE", "TRANSFORM", "TRANSFORMS", "TRANSLATE", "TRANSLATE_REGEX", + "TRANSLATION", "TREAT", "TRIGGER", "TRIGGER_CATALOG", "TRIGGER_NAME", "TRIGGER_SCHEMA", + "TRIM", "TRIM_ARRAY", "TRUE", "TRUNCATE", "TRUSTED", "TYPE", "TYPES", "UESCAPE", "UNBOUNDED", + "UNCOMMITTED", "UNDER", "UNENCRYPTED", "UNION", "UNIQUE", "UNKNOWN", "UNLINK", + "UNLISTEN", "UNLOGGED", "UNNAMED", "UNNEST", "UNTIL", "UNTYPED", "UPDATE", + "UPPER", "URI", "USAGE", "USER", "USER_DEFINED_TYPE_CATALOG", "USER_DEFINED_TYPE_CODE", + "USER_DEFINED_TYPE_NAME", "USER_DEFINED_TYPE_SCHEMA", "USING", "VACUUM", "VALID", + "VALIDATE", "VALIDATOR", "VALUE", "VALUES", "VALUE_OF", "VARBINARY", "VARCHAR", + "VARIADIC", "VARYING", "VAR_POP", "VAR_SAMP", "VERBOSE", "VERSION", "VERSIONING", + "VIEW", "VIEWS", "VOLATILE", "WHEN", "WHENEVER", "WHERE", "WHITESPACE", "WIDTH_BUCKET", + "WINDOW", "WITH", "WITHIN", "WITHOUT", "WORK", "WRAPPER", "WRITE", "XML", "XMLAGG", + "XMLATTRIBUTES", "XMLBINARY", "XMLCAST", "XMLCOMMENT", "XMLCONCAT", "XMLDECLARATION", + "XMLDOCUMENT", "XMLELEMENT", "XMLEXISTS", "XMLFOREST", "XMLITERATE", "XMLNAMESPACES", + "XMLPARSE", "XMLPI", "XMLQUERY", "XMLROOT", "XMLSCHEMA", "XMLSERIALIZE", "XMLTABLE", + "XMLTEXT", "XMLVALIDATE", "YEAR", "YES", "ZONE" +]; \ No newline at end of file diff --git a/lib/table-graph.js b/lib/table-graph.js new file mode 100644 index 0000000..549051b --- /dev/null +++ b/lib/table-graph.js @@ -0,0 
+1,81 @@
+/**
+ * Returns an object keyed by table name with links to
+ * dependents and dependencies with the following structure:
+ * Graph {
+ *   [Table.name]: {
+ *     dependents: {
+ *       [Table.name]: {
+ *         [FromColumn]: 'ToColumn'
+ *       }
+ *     }
+ *     dependencies: {
+ *       [Table.name]: {
+ *         [FromColumn]: 'ToColumn'
+ *       }
+ *     }
+ *   }
+ *   ...
+ * }
+ *
+ * NOTE:
+ * This is some old, somewhat janky code. Could probably use a re-write.
+ * It is used for calculating the sort order when creating tables and
+ * for use in the relationships middleware (which should also be re-written).
+ *
+ * @param  {Object} tables An object of table instances keyed by table name
+ * @return {TableGraph} The graph of tables
+ */
+module.exports = tables => {
+  var graph = {};
+
+  for ( var key in tables ){
+    graph[ key ] = {
+      dependents: {}
+    , dependencies: {}
+    };
+  }
+
+  Object
+    .keys( tables )
+    // Filter down to dals whose schema contains a `references` key
+    .filter( table_name => {
+      var table = tables[ table_name ];
+
+      return Object
+        .keys( table.schema )
+        .some( col_name => {
+          return table.schema[ col_name ].references;
+        });
+    })
+    .forEach( table_name => {
+      var table = tables[ table_name ];
+      var source = graph[ table_name ];
+
+      Object
+        .keys( table.schema )
+        .filter( function( col_name ){
+          return table.schema[ col_name ].references;
+        })
+        .forEach( col_name => {
+          var col = table.schema[ col_name ];
+          var target = graph[ col.references.table ];
+
+          if ( target )
+          if ( !target.dependents[ table_name ] ){
+            target.dependents[ table_name ] = {};
+          }
+
+          if ( !source.dependencies[ col.references.table ] ){
+            source.dependencies[ col.references.table ] = {};
+          }
+
+          if ( target ){
+            target.dependents[ table_name ][ col.references.column ] = col_name;
+          }
+
+          source.dependencies[ col.references.table ][ col_name ] = col.references.column;
+        });
+    });
+
+  return graph;
+};
\ No newline at end of file
diff --git a/lib/table.js b/lib/table.js
new file mode 100644
index 0000000..be1f941
--- /dev/null
+++ b/lib/table.js
@@ -0,0 +1,171 @@
+var _ = require('lodash');
+var QueryGenerator = require('./query-generator');
+var Query = require('./query');
+var ResultTransform = require('./result-transform');
+
+class Table extends QueryGenerator {
+  static create( options ){
+    return new this( options );
+  }
+
+  constructor( options ){
+    super( options );
+
+    if ( typeof options.name !== 'string' ){
+      throw new Error('options.name must be a table or view name string');
+    }
+
+    if ( options.type && [ 'table', 'view' ].indexOf( options.type ) === -1 ){
+      throw new Error('options.type must be either "table" or "view"');
+    }
+
+    if ( options.type === 'view' ){
+      if ( !(options.expression instanceof Query) ){
+        throw new Error('options.expression must be a Query');
+      }
+
+      this.expression = options.expression;
+      this.materialized = options.materialized || false;
+    }
+
+    this.type = options.type || 'table';
+
+    this.name = options.name;
+    this.schema = options.schema || {};
+  }
+
+  clone(){
+    var options = Object.assign( {}, this.options );
+    options.queryTransforms = this.queryTransforms.slice(0);
+    options.resultsTransforms = this.resultsTransforms.slice(0);
+
+    options.schema = _.cloneDeep( this.schema );
+
+    return Table.create( options );
+  }
+
+  getPrimaryKey(){
+    for ( var key in this.schema ){
+      if ( this.schema[ key ].primaryKey === true ){
+        return key;
+      }
+    }
+
+    return Table.defaultPrimaryKey;
+  }
+
+  getIdParamWhereClause( where ){
+    if ( typeof where !== 'object' ){
+      where = { [this.getPrimaryKey()]: 
where }; + } + + return where; + } + + create(){ + if ( this.type === 'view' ){ + return this.query({ + type: 'create-view' + , view: this.name + , orReplace: true + , materialized: this.materialized + , expression: this.expression.toStringAndValues().original + }); + } + + return this.query({ + type: 'create-table' + , ifNotExists: true + , table: this.name + , definition: this.schema + }); + } + + find( where, options ){ + return this.query({ + type: 'select' + , table: this.name + }).mutate( query => { + if ( typeof where === 'object' ){ + query.where( where ); + } + + if ( typeof options === 'object' ){ + Object.assign( query.mosqlQuery, options ); + } + }); + } + + findOne( where ){ + return this.query({ + type: 'select' + , table: this.name + , where: where ? this.getIdParamWhereClause( where ) : {} + , limit: 1 + }).mutate( query => query.after( ResultTransform.firstRow ) ); + } + + insert( values ){ + return this.query({ + type: 'insert' + , table: this.name + , values: values || {} + , returning: ['*'] + }).mutate( query => { + if (!Array.isArray(query.mosqlQuery.values)) { + return query.after( ResultTransform.firstRow ) + } + + return query + }); + } + + upsert( failingColumn, values = {} ){ + // Map update strategy to `excluded.*` so that values that are + // an array of objects still get upserted + var update = Query.getUpsertUpdateFromObject( values ); + + return this.query({ + type: 'insert' + , table: this.name + , values: values + , returning: ['*'] + , conflict: { + target: { column: failingColumn } + , action: { update } + } + }); + } + + remove( where ){ + var query = this.query({ + type: 'delete' + , table: this.name + , where: where ? this.getIdParamWhereClause( where ) : {} + }); + + if ( typeof where !== 'object' ){ + query.mutate( query => query.after( ResultTransform.firstRow ) ); + } + + return query; + } + + update( where ){ + var query = this.query({ + type: 'update' + , table: this.name + , where: where ? 
this.getIdParamWhereClause( where ) : {} + }); + + if ( typeof where !== 'object' ){ + query.mutate( query => query.after( ResultTransform.firstRow ) ); + } + + return query; + } +} + +Table.defaultPrimaryKey = 'id'; + +module.exports = Table; \ No newline at end of file diff --git a/lib/transaction.js b/lib/transaction.js new file mode 100644 index 0000000..ff8e730 --- /dev/null +++ b/lib/transaction.js @@ -0,0 +1,111 @@ +const Pool = require('pg-pool'); +const Query = require('./query'); +const Errors = require('./errors'); +const QueryDebugger = require('./query-debugger'); + +class Transaction { + static create( options ){ + return new this( options ); + } + + constructor( options = {} ){ + if ( !(options.pool instanceof Pool) ){ + throw new Errors.MissingPool(); + } + + this.options = options; + this.pool = options.pool; + this.results = {}; + } + + begin(){ + return this.pool.connect().then( client => { + this.client = client; + + return this.query('BEGIN'); + }); + } + + commit(){ + if ( !this.client ){ + throw new TransactionNotBegunError('commit'); + } + + return this.query('COMMIT').then( tx => { + tx.client.release(); + + return tx; + }); + } + + abort(){ + if ( !this.client ){ + throw new TransactionNotBegunError('abort'); + } + + return this.query('ABORT').then( tx => { + tx.client.release(); + + return tx; + }); + } + + save( savepoint ){ + if ( !this.client ){ + throw new TransactionNotBegunError('create savepoint on'); + } + + return this.query( 'SAVEPOINT ' + savepoint ); + } + + rollback( savepoint ){ + if ( !this.client ){ + throw new TransactionNotBegunError('rollback'); + } + + return this.query( 'ROLLBACK TO ' + savepoint ); + } + + query( query, values = [] ){ + if ( !this.client ){ + throw new TransactionNotBegunError('query'); + } + + if ( query instanceof Query ){ + query = query.toStringAndValues(); + values = query.values; + query = query.toString(); + } + + var qdebugger; + + if ( this.options.debug ){ + qdebugger = new QueryDebugger( query, values ); + } + + return new Promise( ( resolve, reject )=>{ + this.client.query( query, values, ( error, result ) => { + if ( this.options.debug ){ + qdebugger.log(); + qdebugger.end(); + } + + if ( error ){ + return reject( error ); + } + + resolve( this ); + }); + }); + } +} + +class TransactionNotBegunError extends Error { + constructor( action ){ + super(`Cannot ${action} the transaction because it has not yet begun. 
Call tx.begin() first.`); + } +} + +Transaction.TransactionNotBegunError = TransactionNotBegunError; + +module.exports = Transaction; \ No newline at end of file diff --git a/lib/utils.js b/lib/utils.js deleted file mode 100644 index aad65af..0000000 --- a/lib/utils.js +++ /dev/null @@ -1,13 +0,0 @@ -var async = require('async'); -var lodash = require('lodash'); -var Class = require('./class'); - -var utils = module.exports = {}; - -utils.noop = function(){}; - -utils.Class = Class; - -utils.async = async; - -lodash.extend( utils, lodash, async ); diff --git a/middleware/cast-to-json.js b/middleware/cast-to-json.js deleted file mode 100644 index 66d6d6c..0000000 --- a/middleware/cast-to-json.js +++ /dev/null @@ -1,86 +0,0 @@ -/** - * DB Middleware: Cast to JSON - * Automatically stringifies fields - * - * By default, the options are: - * - * { - * operations: ['insert', 'update'] - * , types: ['json'] - * } - * - * var dirac = require('dirac'); - * - * dirac.use( dirac.castToJSON({ - * operations: ['insert', 'update', 'myCustomUpdateFunction'] - * , types: ['json', 'my-special-json-type'] - * })); - */ - -var castToJSON = function( field ){ - var applyField = function( obj ){ - if ( typeof (obj || {})[ field ] == 'object' ){ - obj[ field ] = JSON.stringify( obj[ field ] ); - } - }; - - return function( $query, schema, next ){ - if ( $query.updates ) applyField( $query.updates ); - - if ( Array.isArray( $query.values ) ){ - $query.values.forEach( applyField ); - } else if ( $query.values ){ - applyField( $query.values ); - } - - next(); - }; -}; - -module.exports = function( options ){ - options = options || {}; - - var defaults = { - types: ['json'] - , operations: ['insert', 'update'] - }; - - for ( var key in defaults ){ - if ( !(key in options) ) options[ key ] = defaults[ key ]; - } - - var passesTypeCheck = function( dal, col ){ - col = dal.schema[ col ]; - return options.types.indexOf( col.type ) > -1; - }; - - return function( dirac ){ - // Filter down to dals that pass `options.type` - Object.keys( dirac.dals ).filter( function( dal ){ - dal = dirac.dals[ dal ]; - - return Object.keys( dal.schema ).some( function( col ){ - return passesTypeCheck( dal, col ); - }); - - // Convert to structure with columns to cast - }).map( function( dal ){ - dal = dirac.dals[ dal ]; - - return { - dal: dal - , columns: Object.keys( dal.schema ).filter( function( col ){ - return passesTypeCheck( dal, col ); - }) - }; - - // Each object, each operation, each field, apply middleware - }).forEach( function( obj ){ - options.operations.forEach( function( op ){ - obj.columns.forEach( function( col ){ - obj.dal.before( op, castToJSON( col ) ); - }); - }); - }); - }; -}; \ No newline at end of file diff --git a/middleware/dir.js b/middleware/dir.js deleted file mode 100644 index c265cf7..0000000 --- a/middleware/dir.js +++ /dev/null @@ -1,24 +0,0 @@ -/** - * DAL Directory - * Specify the directory where table schemas/dal defs live - * - * dirac.use( dirac.dir( __dirname + '/dals' ) ) - */ - -var fs = require('fs'); -var path = require('path'); - -module.exports = function( dir ){ - if ( !dir ) throw new Error('Dirac.middleware.directory - Missing first argument'); - - return function( dirac ){ - fs.readdirSync( dir ).filter( function( file ){ - return fs.statSync( path.join( dir, file ) ).isFile(); - }).map( function( file ){ - return require( path.join( dir, file ) ); - }).forEach( function( dal ){ - dirac.register( dal ); - dirac.instantiateDal( dal.name ); - }); - }; -}; \ No newline at end of file diff 
--git a/middleware/embeds.js b/middleware/embeds.js deleted file mode 100644 index b0bcf3b..0000000 --- a/middleware/embeds.js +++ /dev/null @@ -1,104 +0,0 @@ -/** - * Dirac Embeds - * Helps facilitate embedding foreign objects - * - * options: { - * // The methods to apply `after` filters to for embeds - * operations: ['findOne'] - * } - * - * How to use: - * dirac.register({ - * name: 'user' - * , schema: { - * id: { type: 'serial', primaryKey: true } - * , name: { type: 'text' } - * } - * - * , defaultEmbeds: { - * groups: true - * } - * - * // Describes how to embed foreign tables - * , embeds: { - * // Naive example, but shows how to use it - * groups: function( results, $query, callback ){ - * if ( results.length === 0 ) return callback(); - * - * dirac.dals.groups.find({ user_id: results[0].id }, callback ); - * } - * } - * }); - */ - -var async = require('async'); -var _ = require('lodash'); - -module.exports = function( options ){ - options = options || {}; - - var defaults = { - operations: ['findOne'] - }; - - for ( var key in defaults ){ - if ( !(key in options) ) options[ key ] = defaults[ key ]; - } - - return function( dirac ){ - var embedFn = function( table ){ - var dal = dirac.dals[ table ]; - - return function( results, $query, schema, next ){ - if ( !dal ) return next(); - if ( !('embeds' in dal ) ) return next(); - - var key; - - if ( typeof !$query.embeds && !dal.defaultEmbeds ){ - return next(); - } - - $query.embeds = $query.embeds || {}; - - _.defaults( $query.embeds, dal.defaultEmbeds ); - - // Filter down to embeds that we can run - var keys = Object.keys( $query.embeds ).filter( function( key ){ - return key in dal.embeds; - }); - - var fns = keys.map( function( key ){ - return function( done ){ - dal.embeds[ key ]( results, $query, done ); - }; - }); - - fns = _.object( keys, fns ); - - async.parallel( fns, function( error, result ){ - if ( error ) return next( error ); - - // Copy embed results to each original result item - for ( var i = 0, l = results.length; i < l; ++i ){ - for ( var key in result ){ - results[ i ][ key ] = result[ key ]; - } - } - - next(); - }); - }; - }; - - // Filter down to dals that have embeds - Object.keys( dirac.dals ).filter( function( dal ){ - return 'embeds' in dirac.dals[ dal ]; - // Register embedFn - }).forEach( function( table ){ - options.operations.forEach( function( op ){ - dirac.dals[ table ].after( op, embedFn( table ) ); - }); - }); - }; -}; \ No newline at end of file diff --git a/middleware/relationships.js b/middleware/relationships.js index 353fc0e..f9318a4 100644 --- a/middleware/relationships.js +++ b/middleware/relationships.js @@ -1,476 +1,610 @@ -var utils = require('lodash'); -var mosqlUtils = require('mongo-sql/lib/utils'); - -module.exports = function( options ){ - var relationships = function( dirac ){ - // Extend the DAL to add necessary dependents/dependencies cache - // Setup cached dependency graph for use by relationship helpers - var init = dirac.DAL.prototype.initialize; - dirac.DAL = dirac.DAL.extend({ - initialize: function(){ - this.dependents = {}; - this.dependencies = {}; - return init.apply( this, arguments ); +const _ = require('lodash'); +const mosqlUtils = require('mongo-sql/lib/utils'); +const Query = require('../lib/query'); +const QueryTransform = require('../lib/query-transform'); +const Database = require('../lib/database'); + +class RelationshipsDatabase extends Database { + clone(){ + var originalClone = Database.prototype.clone.call( this ); + var clone = new RelationshipsDatabase( 
originalClone ); + + for ( var key in originalClone ){ + clone[ key ] = originalClone[ key ]; + } + + return clone; + } + + register( table ){ + return Database.prototype.register.call( this, table ).mutate( database => { + var transform = Relationships.Transform( database.graph ); + + replaceTransform( database, transform ); + + // Update all tables to use the new transform + for ( var key in database.tables ){ + database.tables[ key ].mutate( table => { + replaceTransform( table, transform ); + }); } }); + } +} - // Push to bottom of stack - dirac.use( function( dirac ){ - dirac.use( function( dirac ){ - // Filter down to dals whose schema contains a `references` key - Object.keys( dirac.dals ).filter( function( table_name ){ - var dal = dirac.dals[ table_name ]; +class QueryWithRelationships extends Query { + one( subQuery ){ + var query = this.instance(); - return Object.keys( dal.schema ).some( function( col_name ){ - return dal.schema[ col_name ].references; - }); - }).forEach( function( table_name ){ - var dal = dirac.dals[ table_name ]; + if ( typeof subQuery == 'string' ){ + subQuery = { table: subQuery }; + } - Object.keys( dal.schema ).filter( function( col_name ){ - return dal.schema[ col_name ].references; - }).forEach( function( col_name ){ - var col = dal.schema[ col_name ]; - var target = dirac.dals[ col.references.table ]; + if ( !Array.isArray( query.mosqlQuery.one ) ){ + query.mosqlQuery.one = []; + } - if ( !target.dependents[ table_name ] ){ - target.dependents[ table_name ] = {}; - } + query.mosqlQuery.one.push( subQuery ); - if ( !dal.dependencies[ col.references.table ] ){ - dal.dependencies[ col.references.table ] = {}; - } + return query; + } - target.dependents[ table_name ][ col.references.column ] = col_name; - dal.dependencies[ col.references.table ][ col_name ] = col.references.column; - }); - }); - }); + many( subQuery ){ + var query = this.instance(); - // Cache incoming/outgoing dependencies - dirac.use( function( dirac ){ - var applyOne = function( table_name, $query ){ - var tmpl = function( data ){ - var where = utils.extend( {}, data.where ); - - data.pivots.forEach( function( p ){ - where[ p.target_col ] = '$' + mosqlUtils.quoteObject( p.source_col, data.source ) + '$'; - }); - - var main = utils.extend({ - type: 'select' - , table: data.target - , alias: data.qAlias - , where: where - , limit: 1 - }, utils.omit( data, ['table', 'alias', 'pivots', 'target', 'source', 'where'] )); - - if ( Array.isArray( main.one ) ){ - main.one.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyOne( main.table, main ); - } + if ( typeof subQuery == 'string' ){ + subQuery = { table: subQuery }; + } - if ( Array.isArray( main.many ) ){ - main.many.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyMany( main.table, main ); - } + if ( !Array.isArray( query.mosqlQuery.many ) ){ + query.mosqlQuery.many = []; + } - if ( Array.isArray( main.pluck ) ){ - main.pluck.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyPluck( main.table, main ); - } + query.mosqlQuery.many.push( subQuery ); - if ( Array.isArray( main.mixin ) ){ - main.mixin.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyMixin( main.table, main ); - } + return query; + } - return { - type: 'expression' - , alias: data.alias - , expression: { - parenthesis: true - , expression: { - type: 'select' - , columns: [{ type: 'row_to_json', expression: data.qAlias }] - , table: main - } - } - }; - }; + mixin( 
subQuery ){ + var query = this.instance(); - $query.one.forEach( function( target ){ - var targetDal = dirac.dals[ target.table ]; + if ( typeof subQuery == 'string' ){ + subQuery = { table: subQuery }; + } - // Immediate dependency not met and not specifying how to get there - if ( !targetDal && !target.where ){ - throw new Error( 'Must specify how to relate table `' + table_name + '` to target `' + target.table + '`' ); - } + if ( !Array.isArray( query.mosqlQuery.mixin ) ){ + query.mosqlQuery.mixin = []; + } - var pivots = []; + query.mosqlQuery.mixin.push( subQuery ); - if ( targetDal ) - if ( targetDal.dependents[ table_name ] ){ - pivots = Object.keys( targetDal.dependents[ table_name ] ).map( function( p ){ - return { - source_col: targetDal.dependents[ table_name ][ p ] - , target_col: p - }; - }); - } + return query; + } - if ( targetDal ) - if ( targetDal.dependencies[ table_name ] ){ - pivots = pivots.concat( Object.keys( targetDal.dependencies[ table_name ] ).map( function( p ){ - return { - source_col: targetDal.dependencies[ table_name ][ p ] - , target_col: p - }; - })); - } + pluck( subQuery ){ + var query = this.instance(); - var context = utils.extend({ - source: target.source || table_name - , target: target.table - , alias: target.alias || target.table - , pivots: pivots - , qAlias: 'r' - }, target ); + if ( typeof subQuery == 'string' ){ + subQuery = { table: subQuery }; + } - context.alias = context.alias || target.table; + if ( !Array.isArray( query.mosqlQuery.pluck ) ){ + query.mosqlQuery.pluck = []; + } - var columnsTarget = $query.type === 'select' ? 'columns' : 'returning'; + query.mosqlQuery.pluck.push( subQuery ); - if ( !$query[ columnsTarget ] ){ - $query[ columnsTarget ] = ['*']; - } + return query; + } +} - $query[ columnsTarget ].push( tmpl( context ) ); - }); - }; +class RelationshipsQueryTransform extends QueryTransform {} - var applyMixin = function( table_name, $query ){ - var cid = 1; - var tmpl = function( data ){ - var where = utils.extend( {}, data.where ); - var on = utils.extend( {}, data.on ); - - data.pivots.forEach( function( p ){ - /*where[ p.target_col ] = */on[ p.target_col ] = '$' + mosqlUtils.quoteObject( p.source_col, data.source ) + '$'; - }); - - var main = utils.extend({ - type: 'select' - , table: data.target - , alias: data.qAlias - , where: where - , limit: 1 - }, utils.omit( data, ['table', 'alias', 'pivots', 'target', 'source', 'where', 'on'] )); - - if ( Array.isArray( main.one ) ){ - main.one.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyOne( main.table, main ); - } +var Relationships = module.exports = options => { + return database => { + var db = new RelationshipsDatabase( Object.assign( {}, database, database.options ) ); + db.Query = QueryWithRelationships; + return db; + }; +}; + +module.exports.Transform = graph => { + return RelationshipsQueryTransform.create( query => { + let tableName = query.table(); + let $query = query.mosqlQuery; + + if ( Array.isArray( $query.many ) ){ + applyMany( graph, tableName, $query ); + } + + if ( Array.isArray( $query.one ) ){ + applyOne( graph, tableName, $query ); + } + + if ( Array.isArray( $query.pluck ) ){ + applyPluck( graph, tableName, $query ); + } + + if ( Array.isArray( $query.mixin ) ){ + applyMixin( graph, tableName, $query ); + } + + return query; + }); +}; + +module.exports.Query = QueryWithRelationships; +module.exports.QueryTransform = RelationshipsQueryTransform; + +var replaceTransform = ( generator, transform )=>{ + // Replace the old 
relationships transform + var didReplace = false; + for ( let i = 0; i < generator.queryTransforms.length; i++ ){ + if ( generator.queryTransforms[ i ] instanceof Relationships.QueryTransform ){ + didReplace = true; + generator.queryTransforms[ i ] = transform; + break; + } + } + + if ( !didReplace ){ + generator.use( transform ); + } +}; + +var applyOne = function( graph, table_name, $query ){ + var tmpl = function( data ){ + var where = _.extend( {}, data.where ); + + data.pivots.forEach( function( p ){ + where[ p.target_col ] = '$' + mosqlUtils.quoteObject( p.source_col, data.source ) + '$'; + }); - if ( Array.isArray( main.many ) ){ - main.many.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyMany( main.table, main ); - } + var main = _.extend({ + type: 'select' + , table: data.target + , alias: data.qAlias + , where: where + , limit: 1 + }, _.omit( data, ['table', 'alias', 'pivots', 'target', 'source', 'where'] )); + + if ( Array.isArray( main.one ) ){ + main.one.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); + applyOne( graph, main.table, main ); + } + + if ( Array.isArray( main.many ) ){ + main.many.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); + applyMany( graph, main.table, main ); + } + + if ( Array.isArray( main.pluck ) ){ + main.pluck.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); + applyPluck( graph, main.table, main ); + } + + if ( Array.isArray( main.mixin ) ){ + main.mixin.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); + applyMixin( graph, main.table, main ); + } + + return { + type: 'expression' + , alias: data.alias + , expression: { + parenthesis: true + , expression: { + type: 'select' + , columns: [{ type: 'row_to_json', expression: data.qAlias }] + , table: main + } + } + }; + }; - if ( Array.isArray( main.pluck ) ){ - main.pluck.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyPluck( main.table, main ); - } + $query.one.forEach( function( target ){ + var targetDal = graph[ target.table ]; - if ( Array.isArray( main.mixin ) ){ - main.mixin.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyMixin( main.table, main ); - } + // Immediate dependency not met and not specifying how to get there + if ( !targetDal && !target.where ){ + throw new Error( 'Must specify how to relate table `' + table_name + '` to target `' + target.table + '`' ); + } + var pivots = []; + + if ( targetDal ) + if ( targetDal.dependents[ table_name ] ){ + pivots = Object.keys( targetDal.dependents[ table_name ] ).map( function( p ){ + return { + source_col: targetDal.dependents[ table_name ][ p ] + , target_col: p + }; + }); + } + + if ( targetDal ) + if ( targetDal.dependencies[ table_name ] ){ + pivots = pivots.concat( Object.keys( targetDal.dependencies[ table_name ] ).map( function( p ){ + return { + source_col: targetDal.dependencies[ table_name ][ p ] + , target_col: p + }; + })); + } + + var context = _.extend({ + source: target.source || table_name + , target: target.table + , alias: target.alias || target.table + , pivots: pivots + , qAlias: 'r' + }, target ); + + context.alias = context.alias || target.table; + + var columnsTarget = $query.type === 'select' ? 
'columns' : 'returning'; + + if ( !$query[ columnsTarget ] ){ + $query[ columnsTarget ] = ['*']; + } + + $query[ columnsTarget ].push( tmpl( context ) ); + }); +}; + +var applyMixin = function(graph, table_name, $query ){ + var getResults = (table, mixins) => { + return mixins + .map(mixin => { + var targetDal = graph[ mixin.table ]; + var where = _.extend( {}, mixin.where ); + var on = _.extend( {}, mixin.on ); + + // Immediate dependency not met and not specifying how to get there + if ( !targetDal && !mixin.where ){ + throw new Error( 'Must specify how to relate table `' + table + '` to target `' + mixin.table + '`' ); + } + + var pivots = []; + + if ( targetDal ) + if ( targetDal.dependents[ table ] ){ + pivots = Object.keys( targetDal.dependents[ table ] ).map( function( p ){ return { - type: 'left' - , alias: data.qalias - , target: data.target - , on: on + source_col: targetDal.dependents[ table ][ p ] + , target_col: p }; - }; - - $query.mixin.forEach( function( target ){ - var targetDal = dirac.dals[ target.table ]; - - // Immediate dependency not met and not specifying how to get there - if ( !targetDal && !target.where ){ - throw new Error( 'Must specify how to relate table `' + table_name + '` to target `' + target.table + '`' ); - } - - var pivots = []; - - if ( targetDal ) - if ( targetDal.dependents[ table_name ] ){ - pivots = Object.keys( targetDal.dependents[ table_name ] ).map( function( p ){ - return { - source_col: targetDal.dependents[ table_name ][ p ] - , target_col: p - }; - }); - } - - if ( targetDal ) - if ( targetDal.dependencies[ table_name ] ){ - pivots = pivots.concat( Object.keys( targetDal.dependencies[ table_name ] ).map( function( p ){ - return { - source_col: targetDal.dependencies[ table_name ][ p ] - , target_col: p - }; - })); - } - - var context = utils.extend({ - source: target.source || table_name - , target: target.table - , alias: target.alias || target.table - , pivots: pivots - , qAlias: 'r' - }, target ); - - context.alias = context.alias || target.table; - - if ( !$query.joins ){ - $query.joins = []; - } - - var columnsTarget = $query.type === 'select' ? 'columns' : 'returning'; - - if ( !$query[ columnsTarget ] ){ - $query[ columnsTarget ] = ['*']; - } - - if ( Array.isArray( target.columns ) ){ - $query[ columnsTarget ] = $query[ columnsTarget ].concat( - target.columns.map( function( column ){ - return typeof column === 'string' - ? 
mosqlUtils.quoteObject( column, target.table ) - : column; - }) - ); - } else { - $query[ columnsTarget ].push({ table: context.alias, name: '*' }) - } - - - $query.joins.push( tmpl( context ) ); }); - }; - - var applyMany = function( table_name, $query ){ - var tmpl = function( data ){ - var where = utils.extend( {}, data.where ); - - data.pivots.forEach( function( p ){ - where[ p.target_col ] = '$' + mosqlUtils.quoteObject( p.source_col, data.source ) + '$'; - }); - - var main = utils.extend({ - type: 'select' - , table: data.target - , where: where - , alias: data.qAlias - }, utils.omit( data, ['table', 'alias', 'pivots', 'target', 'source', 'where'] )); - - if ( Array.isArray( main.one ) ){ - main.one.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyOne( main.table, main ); - } - - if ( Array.isArray( main.many ) ){ - main.many.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyMany( main.table, main ); - } - - if ( Array.isArray( main.pluck ) ){ - main.pluck.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyPluck( main.table, main ); - } - - if ( Array.isArray( main.mixin ) ){ - main.mixin.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyMixin( main.table, main ); - } + } + if ( targetDal ) + if ( targetDal.dependencies[ table ] ){ + pivots = pivots.concat( Object.keys( targetDal.dependencies[ table ] ).map( function( p ){ return { - type: 'expression' - , alias: data.alias - , expression: { - parenthesis: true - , expression: { - type: 'array_to_json' - , expression: { - type: 'array' - , expression: { - type: 'select' - , columns: [{ type: 'row_to_json', expression: data.qAlias }] - , table: main - } - } - } - } + source_col: targetDal.dependencies[ table ][ p ] + , target_col: p }; - }; + })); + } - $query.many.forEach( function( target ){ - var targetDal = dirac.dals[ target.table ]; + pivots.forEach( function( p ){ + on[ p.target_col ] = '$' + mosqlUtils.quoteObject( p.source_col, table ) + '$'; + }); - // Immediate dependency not met and not specifying how to get there - if ( !targetDal && !target.where ){ - throw new Error( 'Must specify how to relate table `' + table_name + '` to target `' + target.table + '`' ); - } + var join = { + type: 'left' + , alias: mixin.alias + , target: mixin.table + , on: on + }; - var pivots = []; + var joins = [join] + var columns; + + if ( Array.isArray( mixin.columns ) ){ + columns = mixin.columns.map( function( column ){ + return typeof column === 'string' + ? 
mosqlUtils.quoteObject( column, mixin.table ) + : column; + }) + } else { + columns = [{ table: mixin.alias || mixin.table, name: '*' }] + } + + if (Array.isArray(mixin.mixin)) { + const subResult = getResults(mixin.table, mixin.mixin) + joins = joins.concat(subResult.joins) + columns = columns.concat(subResult.columns) + } + + return { joins, columns } + }) + // Flatten nested mixins + .reduce((final, individual) => { + final.joins = final.joins.concat(individual.joins) + final.columns = final.columns.concat(individual.columns) + return final + }, { columns: [], joins: [] }) + } + + var results = getResults(table_name, $query.mixin) + $query.joins = ($query.joins || []).concat( results.joins ); + $query.columns = ($query.columns || ['*']).concat( results.columns ); +}; + +var applyMixinOld = function( graph, table_name, $query ){ + var tmpl = function( data ){ + var where = _.extend( {}, data.where ); + var on = _.extend( {}, data.on ); + + data.pivots.forEach( function( p ){ + on[ p.target_col ] = '$' + mosqlUtils.quoteObject( p.source_col, data.source ) + '$'; + }); - if ( targetDal ) - if ( targetDal.dependencies[ table_name ] ){ - pivots = Object.keys( targetDal.dependencies[ table_name ] ).map( function( p ){ - return { - source_col: targetDal.dependencies[ table_name ][ p ] - , target_col: p - }; - }); - } + var results = [] - var context = utils.extend({ - source: target.source || table_name - , target: target.table - , alias: target.alias || target.table - , pivots: pivots - , qAlias: 'r' - }, target ); + if ( Array.isArray( data.mixin ) ){ + applyMixin( graph, data.target, data ); + } - context.alias = context.alias || target.table; + return { + type: 'left' + , alias: data.qalias + , target: data.target + , on: on + }; + }; - var columnsTarget = $query.type === 'select' ? 'columns' : 'returning'; + $query.mixin.forEach( function( target ){ + var targetDal = graph[ target.table ]; - if ( !$query[ columnsTarget ] ){ - $query[ columnsTarget ] = ['*']; - } + // Immediate dependency not met and not specifying how to get there + if ( !targetDal && !target.where ){ + throw new Error( 'Must specify how to relate table `' + table_name + '` to target `' + target.table + '`' ); + } - $query[ columnsTarget ].push( tmpl( context ) ); - }); + var pivots = []; + + if ( targetDal ) + if ( targetDal.dependents[ table_name ] ){ + pivots = Object.keys( targetDal.dependents[ table_name ] ).map( function( p ){ + return { + source_col: targetDal.dependents[ table_name ][ p ] + , target_col: p + }; + }); + } + + if ( targetDal ) + if ( targetDal.dependencies[ table_name ] ){ + pivots = pivots.concat( Object.keys( targetDal.dependencies[ table_name ] ).map( function( p ){ + return { + source_col: targetDal.dependencies[ table_name ][ p ] + , target_col: p }; + })); + } + + var context = _.extend({ + source: target.source || table_name + , target: target.table + , alias: target.alias || target.table + , pivots: pivots + , qAlias: 'r' + }, target ); + + context.alias = context.alias || target.table; + + if ( !$query.joins ){ + $query.joins = []; + } + + var columnsTarget = $query.type === 'select' ? 'columns' : 'returning'; + + if ( !$query[ columnsTarget ] ){ + $query[ columnsTarget ] = ['*']; + } + + if ( Array.isArray( target.columns ) ){ + $query[ columnsTarget ] = $query[ columnsTarget ].concat( + target.columns.map( function( column ){ + return typeof column === 'string' + ? 
mosqlUtils.quoteObject( column, target.table ) + : column; + }) + ); + } else { + $query[ columnsTarget ].push({ table: context.alias, name: '*' }) + } + + + $query.joins = $query.joins.concat( tmpl( context ) ); + }); +}; + +var applyMany = function( graph, table_name, $query ){ + var tmpl = function( data ){ + var where = _.extend( {}, data.where ); + + data.pivots.forEach( function( p ){ + where[ p.target_col ] = '$' + mosqlUtils.quoteObject( p.source_col, data.source ) + '$'; + }); - var applyPluck = function( table_name, $query ){ - var tmpl = function( data ){ - var where = utils.extend( {}, data.where ); + var main = _.extend({ + type: 'select' + , table: data.target + , where: where + , alias: data.qAlias + }, _.omit( data, ['table', 'alias', 'pivots', 'target', 'source', 'where'] )); + + if ( Array.isArray( main.one ) ){ + main.one.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); + applyOne( graph, main.table, main ); + } + + if ( Array.isArray( main.many ) ){ + main.many.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); + applyMany( graph, main.table, main ); + } + + if ( Array.isArray( main.pluck ) ){ + main.pluck.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); + applyPluck( graph, main.table, main ); + } + + if ( Array.isArray( main.mixin ) ){ + main.mixin.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); + applyMixin( graph, main.table, main ); + } + + return { + type: 'expression' + , alias: data.alias + , expression: { + parenthesis: true + , expression: { + type: 'array_to_json' + , expression: { + type: 'array' + , expression: { + type: 'select' + , columns: [{ type: 'row_to_json', expression: data.qAlias }] + , table: main + } + } + } + } + }; + }; - data.pivots.forEach( function( p ){ - where[ p.target_col ] = '$' + mosqlUtils.quoteObject( p.source_col, data.source ) + '$'; - }); + $query.many.forEach( function( target ){ + var targetDal = graph[ target.table ]; - var main = utils.extend({ - type: 'select' - , table: data.target - , where: where - , alias: data.qAlias - }, utils.omit( data, ['table', 'alias', 'pivots', 'target', 'source', 'where', 'column'] )); + // Immediate dependency not met and not specifying how to get there + if ( !targetDal && !target.where ){ + throw new Error( 'Must specify how to relate table `' + table_name + '` to target `' + target.table + '`' ); + } - if ( Array.isArray( main.one ) ){ - main.one.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyOne( main.table, main ); - } + var pivots = []; - if ( Array.isArray( main.many ) ){ - main.many.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyMany( main.table, main ); - } + if ( targetDal ) + if ( targetDal.dependencies[ table_name ] ){ + pivots = Object.keys( targetDal.dependencies[ table_name ] ).map( function( p ){ + return { + source_col: targetDal.dependencies[ table_name ][ p ] + , target_col: p + }; + }); + } - if ( Array.isArray( main.pluck ) ){ - main.pluck.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyPluck( main.table, main ); - } + var context = _.extend({ + source: target.source || table_name + , target: target.table + , alias: target.alias || target.table + , pivots: pivots + , qAlias: 'r' + }, target ); - if ( Array.isArray( main.mixin ) ){ - main.mixin.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); - applyMixin( main.table, main ); - } + context.alias = context.alias || 
target.table; - return { - type: 'expression' - , alias: data.alias - , expression: { - parenthesis: true - , expression: { - type: 'array' - , expression: { - type: 'select' - , columns: [ data.column ] - , table: main - } - } - } - }; - }; + var columnsTarget = $query.type === 'select' ? 'columns' : 'returning'; - $query.pluck.forEach( function( target ){ - var targetDal = dirac.dals[ target.table ]; + if ( !$query[ columnsTarget ] ){ + $query[ columnsTarget ] = ['*']; + } - // Immediate dependency not met and not specifying how to get there - if ( !targetDal && !target.where ){ - throw new Error( 'Must specify how to relate table `' + table_name + '` to target `' + target.table + '`' ); - } + $query[ columnsTarget ].push( tmpl( context ) ); + }); +}; - var pivots = []; +var applyPluck = function( graph, table_name, $query ){ + var tmpl = function( data ){ + var where = _.extend( {}, data.where ); - if ( targetDal ) - if ( targetDal.dependencies[ table_name ] ){ - pivots = Object.keys( targetDal.dependencies[ table_name ] ).map( function( p ){ - return { - source_col: targetDal.dependencies[ table_name ][ p ] - , target_col: p - }; - }); - } + data.pivots.forEach( function( p ){ + where[ p.target_col ] = '$' + mosqlUtils.quoteObject( p.source_col, data.source ) + '$'; + }); - var context = utils.extend({ - source: target.source || table_name - , target: target.table - , alias: target.alias || target.table - , pivots: pivots - , qAlias: 'r' - }, target ); + var main = _.extend({ + type: 'select' + , table: data.target + , where: where + , alias: data.qAlias + }, _.omit( data, ['table', 'alias', 'pivots', 'target', 'source', 'where', 'column'] )); + + if ( Array.isArray( main.one ) ){ + main.one.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); + applyOne( graph, main.table, main ); + } + + if ( Array.isArray( main.many ) ){ + main.many.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); + applyMany( graph, main.table, main ); + } + + if ( Array.isArray( main.pluck ) ){ + main.pluck.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); + applyPluck( graph, main.table, main ); + } + + if ( Array.isArray( main.mixin ) ){ + main.mixin.forEach( function( t ){ t.qAlias = t.qAlias || (data.qAlias + 'r'); }); + applyMixin( graph, main.table, main ); + } + + return { + type: 'expression' + , alias: data.alias + , expression: { + parenthesis: true + , expression: { + type: 'array' + , expression: { + type: 'select' + , columns: [ data.column ] + , table: main + } + } + } + }; + }; - context.alias = context.alias || target.table; + $query.pluck.forEach( function( target ){ + var targetDal = graph[ target.table ]; - var columnsTarget = $query.type === 'select' ? 
'columns' : 'returning'; + // Immediate dependency not met and not specifying how to get there + if ( !targetDal && !target.where ){ + throw new Error( 'Must specify how to relate table `' + table_name + '` to target `' + target.table + '`' ); + } - if ( !$query[ columnsTarget ] ){ - $query[ columnsTarget ] = ['*']; - } + var pivots = []; - $query[ columnsTarget ].push( tmpl( context ) ); - }); + if ( targetDal ) + if ( targetDal.dependencies[ table_name ] ){ + pivots = Object.keys( targetDal.dependencies[ table_name ] ).map( function( p ){ + return { + source_col: targetDal.dependencies[ table_name ][ p ] + , target_col: p }; + }); + } - var options = { - operations: ['find', 'findOne', 'update', 'insert'] - }; + var context = _.extend({ + source: target.source || table_name + , target: target.table + , alias: target.alias || target.table + , pivots: pivots + , qAlias: 'r' + }, target ); - Object.keys( dirac.dals ).forEach( function( table_name ){ - var dal = dirac.dals[ table_name ]; - - options.operations.forEach( function( op ){ - dal.before( op, function( $query, schema, next ){ - if ( Array.isArray( $query.many ) ) applyMany( table_name, $query ); - if ( Array.isArray( $query.one ) ) applyOne( table_name, $query ); - if ( Array.isArray( $query.pluck ) ) applyPluck( table_name, $query ); - if ( Array.isArray( $query.mixin ) ) applyMixin( table_name, $query ); - return next(); - }); - }); - }); - }); - }); - }; + context.alias = context.alias || target.table; + + var columnsTarget = $query.type === 'select' ? 'columns' : 'returning'; - relationships.__immediate = true; + if ( !$query[ columnsTarget ] ){ + $query[ columnsTarget ] = ['*']; + } - return relationships; + $query[ columnsTarget ].push( tmpl( context ) ); + }); }; \ No newline at end of file diff --git a/middleware/table-ref.js b/middleware/table-ref.js deleted file mode 100644 index 1971f74..0000000 --- a/middleware/table-ref.js +++ /dev/null @@ -1,46 +0,0 @@ -/** - * Dirac Table Referencing - - * Automatically use a column for table references in schemas. 
- *
- * {
- *   name: 'groups'
- * , definition: {
- *     id: { type: 'serial', primaryKey: true }
- *   , name: { type: 'text' }
- *   , uid: { type: 'integer', references: { table: 'users' } }
- *   }
- * }
- */
-
-var defaults = {
-  column: 'id'
-};
-
-module.exports = function( options ){
-  options = options || {};
-
-  for ( var key in defaults ){
-    if ( key in options ) continue;
-    options[ key ] = defaults[ key ];
-  }
-
-  return function( dirac ){
-    // Adds default column ref to col
-    var addColumnRef = function( dal, col ){
-      var column = dirac.dals[ dal ].schema[ col ];
-      if ( !('references' in column) ) return;
-      if ( typeof column.references !== 'object' || 'column' in column.references ) return;
-
-      column.references.column = options.column;
-    };
-
-    // Adds default column refs to dal
-    var addRefs = function( dal ){
-      Object.keys( dirac.dals[ dal ].schema ).forEach( function( col ){
-        addColumnRef( dal, col );
-      });
-    };
-
-    Object.keys( dirac.dals ).forEach( addRefs );
-  };
-};
\ No newline at end of file
diff --git a/package.json b/package.json
index 4a7b096..00367dc 100644
--- a/package.json
+++ b/package.json
@@ -1,6 +1,6 @@
 {
   "name": "dirac",
-  "version": "0.4.17",
+  "version": "1.0.0-rc-2",
   "description": "Node-PG database layer built with MoSQL",
   "main": "index.js",
   "directories": {
@@ -24,13 +24,17 @@
   "gitHead": "f49aed817b0db67e83e2e359ed43d3c2a434992b",
   "readmeFilename": "README.md",
   "dependencies": {
-    "mongo-sql": "~2.x.x",
-    "async": "~0.2.9",
-    "pg": "*",
-    "lodash": "~1.3.1"
+    "chalk": "^1.1.3",
+    "lodash": "^4.13.1",
+    "mongo-sql": "^3.0.0",
+    "pg-connection-string": "^0.1.3",
+    "pg-pool": "^1.0.1",
+    "pg-query-stream": "^1.0.0",
+    "pluralize": "^2.0.0",
+    "sqlformatter": "^0.1.0"
   },
   "peerDependencies": {},
   "devDependencies": {
-    "mocha": "~1.12.0"
+    "mocha": "^2.5.3"
   }
 }
diff --git a/test/database.js b/test/database.js
new file mode 100644
index 0000000..b1da5be
--- /dev/null
+++ b/test/database.js
@@ -0,0 +1,181 @@
+var assert = require('assert');
+var PGPool = require('pg-pool');
+var Database = require('../lib/database');
+var Table = require('../lib/table');
+
+describe('Database', ()=>{
+  it('Database.create()', ()=>{
+    var database = Database.create();
+
+    assert.equal(
+      database.clientOptions.toString()
+    , `postgres://${process.env.USER}@localhost:5432/${process.env.USER}`
+    );
+
+    assert( database.pool instanceof PGPool );
+
+    database = Database.create('postgres://localhost:5432/test');
+
+    assert.equal(
+      database.clientOptions.toString()
+    , `postgres://${process.env.USER}@localhost:5432/test`
+    );
+
+    database = Database.create({
+      user: 'foo'
+    , host: 'bar'
+    , database: 'baz'
+    });
+
+    assert.equal(
+      database.clientOptions.toString()
+    , 'postgres://foo@bar:5432/baz'
+    );
+  });
+
+  it('.table(...)', ()=>{
+    var db = Database.create();
+
+    var table1 = db.table('table1');
+    assert( table1 instanceof Table );
+    assert.equal( table1.name, 'table1' );
+
+    var table2 = db.table({ name: 'table2' });
+
+    var db2 = db.before( query => query.where('foo', 'bar') );
+    var table3 = db2.table({ name: 'table3' });
+
+    assert.equal( table3.queryTransforms.length, 1 );
+  });
+
+  it('.adaptTable(...)', ()=>{
+    var db = Database.create().database('testtest');
+
+    class CustomTable extends Table {
+      foo(v){
+        return this.query().where('foo', v)
+      }
+    }
+
+    var customTable1 = new CustomTable({
+      name: 'custom_table_1'
+    , pool: new PGPool({ database: 'test' })
+    });
+
+    var customTable2 = db.adaptTable( customTable1 );
+
+    assert.equal( customTable1.pool.options.database, 'test' );
+    
assert.equal( customTable2.pool.options.database, 'testtest' ); + }) + + it('.register(...)', ()=>{ + var db = Database.create(); + var table1 = db.table('table1'); + + var db2 = db.register( table1 ); + + assert.deepEqual( db.tables, {} ); + assert( 'table1' in db2 ); + + var db3 = db2.register({ name: 'table2' }); + + assert.deepEqual( db.tables, {} ); + assert( 'table1' in db3 ); + assert( 'table2' in db3 ); + assert( !('table2' in db2) ); + + var db4 = db3.register('table3'); + + assert( 'table1' in db4 ); + assert( 'table2' in db4 ); + assert( 'table3' in db4 ); + }); + + it('.getCreateTableAndViewList()', ()=>{ + var db = Database.create() + .register({ + name: 'users' + , schema: { + id: { type: 'serial', primaryKey: true } + , name: { type: 'text' } + } + }) + .register({ + name: 'books' + , schema: { + id: { type: 'serial', primaryKey: true } + , name: { type: 'text' } + } + }) + .register({ + name: 'user_books' + , schema: { + user_id: { type: 'int', references: { table: 'users', column: 'id' } } + , book_id: { type: 'int', references: { table: 'books', column: 'id' } } + } + }); + + var list = db.getCreateTableAndViewList(); + + assert.deepEqual( list, [ + 'books' + , 'users' + , 'user_books' + ]); + }); + + it('.host(val)', ()=>{ + var database = Database.create().host('foobar'); + + assert.equal( + database.clientOptions.toString() + , `postgres://${process.env.USER}@foobar:5432/${process.env.USER}` + ); + + assert.equal( database.pool.options.host, 'foobar' ); + }); + + it('.port(val)', ()=>{ + var database = Database.create().port(1234); + + assert.equal( + database.clientOptions.toString() + , `postgres://${process.env.USER}@localhost:1234/${process.env.USER}` + ); + + assert.equal( database.pool.options.port, 1234 ); + }); + + it('.user(val)', ()=>{ + var database = Database.create().user('foobar'); + + assert.equal( + database.clientOptions.toString() + , `postgres://foobar@localhost:5432/${process.env.USER}` + ); + + assert.equal( database.pool.options.user, 'foobar' ); + }); + + it('.database(val)', ()=>{ + var database = Database.create().database('foobar'); + + assert.equal( + database.clientOptions.toString() + , `postgres://${process.env.USER}@localhost:5432/foobar` + ); + + assert.equal( database.pool.options.database, 'foobar' ); + }); + + it('.ssl(val)', ()=>{ + var database = Database.create().ssl(true); + + assert.equal( + database.clientOptions.toString() + , `postgres://${process.env.USER}@localhost:5432/${process.env.USER}?ssl=true` + ); + + assert.equal( database.pool.options.ssl, true ); + }); +}); \ No newline at end of file diff --git a/test/dirac.js b/test/dirac.js new file mode 100644 index 0000000..cddf4c1 --- /dev/null +++ b/test/dirac.js @@ -0,0 +1,26 @@ +const assert = require('assert') +const dirac = require('../') +const Table = require('../lib/table') + +describe('Dirac', () => { + it('Should allow for tables with middleware', () => { + let db = dirac() + + const fooTable = new Table({ + name: 'foo' + , schema: { + id: { type: 'serial', primaryKey: true } + } + }) + .before(function MyQueryTransform( query ){ + return query + }) + + db = db.register( fooTable ) + + const query = db.foo.query() + + assert.equal(query.queryTransforms.length, 2) + assert.equal(query.queryTransforms[0].handler.name, 'MyQueryTransform') + }) +}) \ No newline at end of file diff --git a/test/middleware.js b/test/middleware.js deleted file mode 100644 index 134db4b..0000000 --- a/test/middleware.js +++ /dev/null @@ -1,475 +0,0 @@ -var _ = require('lodash'); -var async 
= require('async'); -var dirac = require('../'); -var assert = require('assert'); - -describe ('Middleware', function(){ - describe('Cast To JSON', function(){ - beforeEach( function( done ){ - dirac.destroy(); - - dirac.use( dirac.castToJSON() ); - - dirac.register({ - name: 'test_a' - , schema: { - test_field: { - type: 'json' - } - } - }); - - dirac.init({ database: 'dirac_cast_to_json_test'}); - - done(); - }); - - it('should cast before insert', function( done ){ - dirac.dals.test_a.before('insert', function( $query, schema, next ){ - assert.equal( - typeof $query.values.test_field - , 'string' - ); - - done(); - }); - - dirac.dals.test_a.insert({ - test_field: {} - }); - }); - - it('should cast before update', function( done ){ - dirac.dals.test_a.before('update', function( $query, schema, next ){ - assert.equal( - typeof $query.updates.test_field - , 'string' - ); - - done(); - }); - - dirac.dals.test_a.update({}, { - test_field: {} - }); - }); - }); - - describe ('Table References', function(){ - beforeEach( function(){ - dirac.destroy(); - dirac.use( dirac.tableRef() ); - }); - - it ('should add column refs', function(){ - dirac.register({ - name: 'groups' - , schema: { - id: { type: 'serial', primaryKey: true } - , name: { type: 'text' } - , uid: { type: 'integer', references: { table: 'users' } } - } - }); - - dirac.init({ connString: 'postgres://localhost/db_does_not_matter' }); - - assert( dirac.dals.groups.schema.uid.references.column === 'id' ); - }); - }); - - describe ('Embeds', function(){ - beforeEach( function(){ - dirac.destroy(); - dirac.use( dirac.embeds() ); - }); - - it ('should embed groups', function( done ){ - dirac.register({ - name: 'users' - , schema: { - id: { type: 'serial', primaryKey: true } - , email: { type: 'text' } - } - - , defaultEmbeds: { - groups: true - } - - , embeds: { - groups: function( results, $query, callback ){ - if ( results.length === 0 ) return callback(); - dirac.dals.groups.find({ uid: results[0].id }, callback ); - } - } - }); - - dirac.register({ - name: 'groups' - , schema: { - id: { type: 'serial', primaryKey: true } - , uid: { type: 'integer', references: { table: 'users' } } - , name: { type: 'text' } - } - }); - - dirac.init({ connString: 'postgres://localhost/dirac_test' }); - - dirac.sync({ force: true }, function(){ - async.waterfall([ - function( cb ){ - dirac.dals.users.insert( { email: 'blah' }, cb ); - } - , function( results, cb ){ - dirac.dals.groups.insert({ name: 'test', uid: results[0].id }, cb ); - } - ], function( error ){ - assert( !error ); - - dirac.dals.users.findOne( 1, function( error, user ){ - assert( !error ); - assert( Array.isArray( user.groups ) ); - assert( user.groups.length === 1 ); - assert( user.groups[0].name === 'test' ); - done(); - }); - }); - }); - }); - }); - - describe('Directory', function(){ - beforeEach( function(){ - dirac.destroy(); - }); - - it ('should use a directory for dal registration', function(){ - dirac.use( dirac.dir( __dirname + '/test-dals' ) ); - dirac.init({ connString: 'postgres://localhost/dirac_test' }); - assert( dirac.dals.test_tbl instanceof dirac.DAL ); - }); - }); - - describe( 'Relationships', function(){ - beforeEach( function(){ - dirac.destroy(); - }); - - it( 'Should describe a one-to-many relationship', function( done ){ - dirac.use( dirac.relationships() ); - - dirac.register({ - name: 'users' - , schema: { - id: { type: 'serial', primaryKey: true } - , email: { type: 'text' } - } - }); - - dirac.register({ - name: 'groups' - , schema: { - id: { type: 
'serial', primaryKey: true } - , uid: { type: 'integer', references: { table: 'users', column: 'id' } } - , name: { type: 'text' } - } - }); - - dirac.init({ connString: 'postgres://localhost/dirac_test' }); - - async.waterfall([ - dirac.sync.bind( dirac, { force: true } ) - , function( next ){ - dirac.dals.users.insert( { email: 'poop@poop.com' }, function( error, user ){ - assert( !error, error ); - - user = user[0]; - - assert( user.id ); - - next( null, user ); - }); - } - - // Insert some other users to ensure we're not screwing this up - , function( user, next ){ - dirac.dals.users.insert( { email: 'poop2@poop.com' }, function( error, user2 ){ - assert( !error, error ); - - user2 = user2[0]; - - assert( user2.id ); - - next( null, user, user2 ); - }); - } - - , function( user, user2, next ){ - var groups = [ - { uid: user.id, name: 'client '} - , { uid: user.id, name: 'test-123 '} - ]; - - dirac.dals.groups.insert( groups.concat({ uid: user2.id, name: 'client' }), function( error ){ - return next( error, user, groups ); - }); - } - - , function( user, groups, next ){ - dirac.dals.users.findOne( user.id, { many: [{ table: 'groups' }] }, function( error, user ){ - assert( !error, error ); - - assert( Array.isArray( user.groups ), 'user.groups is ' + typeof user.groups ); - groups = groups.map( function( g ){ - return g.name; - }); - - user.groups.map( function( g ){ - return g.name; - }).forEach( function( g ){ - assert( groups.indexOf( g ) > -1, g + ' not in original groups' ); - }); - - next(); - }); - } - ], done ); - }); - - it( 'Should describe a one-to-one relationship', function( done ){ - dirac.use( dirac.relationships() ); - - dirac.register({ - name: 'users' - , schema: { - id: { type: 'serial', primaryKey: true } - , email: { type: 'text' } - } - }); - - dirac.register({ - name: 'extension' - , schema: { - id: { type: 'serial', primaryKey: true } - , uid: { type: 'integer', references: { table: 'users', column: 'id' } } - , name: { type: 'text' } - } - }); - - var EXTENSION_NAME = 'Blah'; - - dirac.init({ connString: 'postgres://localhost/dirac_test' }); - - async.waterfall([ - dirac.sync.bind( dirac, { force: true } ) - , function( next ){ - dirac.dals.users.insert( { email: 'poop@poop.com' }, function( error, user ){ - assert( !error, error ); - - user = user[0]; - - assert( user.id ); - - next( null, user ); - }); - } - - , function( user, next ){ - dirac.dals.extension.insert( { uid: user.id, name: EXTENSION_NAME }, function( error ){ - return next( error, user ); - }); - } - - , function( user, next ){ - dirac.dals.users.findOne( user.id, { one: [{ table: 'extension' }] }, function( error, user ){ - assert( !error, error ); - - assert.equal( user.extension.name, EXTENSION_NAME ); - - next(); - }); - } - ], done ); - }); - - it( 'Should describe a one-to-many relationship, but pluck a column', function( done ){ - dirac.use( dirac.relationships() ); - - dirac.register({ - name: 'users' - , schema: { - id: { type: 'serial', primaryKey: true } - , email: { type: 'text' } - } - }); - - dirac.register({ - name: 'groups' - , schema: { - id: { type: 'serial', primaryKey: true } - , uid: { type: 'integer', references: { table: 'users', column: 'id' } } - , name: { type: 'text' } - } - }); - - dirac.init({ connString: 'postgres://localhost/dirac_test' }); - - async.waterfall([ - dirac.sync.bind( dirac, { force: true } ) - , function( next ){ - dirac.dals.users.insert( { email: 'poop@poop.com' }, function( error, user ){ - assert( !error, error ); - - user = user[0]; - - assert( 
user.id ); - - next( null, user ); - }); - } - - , function( user, next ){ - var groups = [ - { uid: user.id, name: 'client-blah '} - , { uid: user.id, name: 'test-1234'} - ]; - - dirac.dals.groups.insert( groups, function( error ){ - return next( error, user, groups ); - }); - } - - , function( user, groups, next ){ - dirac.dals.users.findOne( user.id, { pluck: [{ table: 'groups', column: 'name' }] }, function( error, user ){ - assert( !error, error ); - - assert( Array.isArray( user.groups ), 'user.groups is ' + typeof user.groups ); - groups = groups.map( function( g ){ - return g.name; - }); - - user.groups.forEach( function( g ){ - assert( groups.indexOf( g ) > -1, g + ' not in original groups' ); - }); - - next(); - }); - } - ], done ); - }); - - it( 'Should describe a mixin relationship', function( done ){ - dirac.use( dirac.relationships() ); - - dirac.register({ - name: 'users' - , schema: { - id: { type: 'serial', primaryKey: true } - , email: { type: 'text' } - } - }); - - dirac.register({ - name: 'orders' - , schema: { - id: { type: 'serial', primaryKey: true } - , uid: { type: 'integer', references: { table: 'users', column: 'id' } } - , foo: { type: 'text' } - } - }); - - dirac.register({ - name: 'invoices' - , schema: { - id: { type: 'serial', primaryKey: true } - , uid: { type: 'int', references: { table: 'users', column: 'id' } } - } - }); - - dirac.register({ - name: 'invoice_orders' - , schema: { - id: { type: 'serial', primaryKey: true } - , iid: { type: 'integer', references: { table: 'invoices', column: 'id' } } - , oid: { type: 'integer', references: { table: 'orders', column: 'id' } } - } - }); - - dirac.init({ connString: 'postgres://localhost/dirac_test' }); - - async.waterfall([ - dirac.sync.bind( dirac, { force: true } ) - // Insert some noise in our data - , dirac.dals.users.insert.bind( dirac.dals.users, { email: 'test1@test.com '} ) - , function( user, next ){ - return dirac.dals.orders.insert( { uid: user.id, foo: 'bar' }, next ); - } - , function( order, next ){ - dirac.dals.users.insert( { email: 'poop@poop.com' }, function( error, user ){ - assert( !error, error ); - - user = user[0]; - - assert( user.id ); - - next( null, user ); - }); - } - - , function( user, next ){ - var orders = new Array(2) - .join() - .split(',') - .map( _.identity.bind( null, { uid: user.id, foo: 'bar' }) ); - - dirac.dals.orders.insert( orders, function( error, results ){ - return next( error, user, results ); - }); - } - - , function( user, orders, next ){ - var doc = { - uid: user.id - }; - - dirac.dals.invoices.insert( doc, function( error, invoice ){ - return next( error, user, orders, invoice[0] ); - }); - } - - , function( user, orders, invoice, next ){ - var uios = orders.map( function( order ){ - return { oid: order.id, iid: invoice.id }; - }); - - dirac.dals.invoice_orders.insert( uios, function( error, results ){ - return next( error, user, orders, invoice, results ); - }); - } - - , function( user, orders, invoice, uios, next ){ - dirac.dals.invoices.findOne( invoice.id, { - many: [ { table: 'invoice_orders' - , alias: 'orders' - , mixin: [{ table: 'orders', columns: ['id', 'uid']}] - } - ] - }, function( error, result ){ - if ( error ) return next( error ); - - assert.equal( result.id, invoice.id ); - assert.equal( result.orders.length, uios.length ); - - uios.forEach( function( uio, i ){ - assert.equal( result.orders[ i ].iid, uio.iid ); - assert.equal( result.orders[ i ].oid, uio.oid ); - assert.equal( result.orders[ i ].uid, user.id ); - assert.equal( 
result.orders[ i ].foo, undefined ); - }); - - return next(); - }); - } - ], done ); - }); - }); -}); \ No newline at end of file diff --git a/test/query-generator.js b/test/query-generator.js new file mode 100644 index 0000000..8d62622 --- /dev/null +++ b/test/query-generator.js @@ -0,0 +1,179 @@ +const assert = require('assert'); +const mosql = require('mongo-sql'); +const PGPool = require('pg-pool'); +const QueryGenerator = require('../lib/query-generator'); +const Query = require('../lib/query'); +const Immutable = require('../lib/immutable'); +const QueryTransform = require('../lib/query-transform'); +const ResultTransform = require('../lib/result-transform'); + +describe('QueryGenerator', ()=>{ + var pool = new PGPool(); + + it('constructor()', ()=>{ + var generator = new QueryGenerator({ pool }); + + assert( generator instanceof Immutable ); + + assert.deepEqual( generator.queryTransforms, [] ); + assert.deepEqual( generator.resultsTransforms, [] ); + }); + + it('.getCreateQueryOptions()', ()=>{ + var generator = new QueryGenerator({ pool }); + var options = generator.getCreateQueryOptions(); + + assert( 'mosql' in options ); + assert.deepEqual( options.queryTransforms, [] ); + assert.deepEqual( options.resultsTransforms, [] ); + }); + + it('.query( query )', ()=>{ + var generator = new QueryGenerator({ pool }); + var query1 = generator.query(); + + assert( query1 instanceof Query ); + + generator = new QueryGenerator({ pool }); + query1 = generator.query(); + assert.equal( query1.pool, generator.pool ); + }); + + it('.clone()', ()=>{ + var generator1 = new QueryGenerator({ pool }) + .before( query => query.where('foo', 'bar') ) + .after( results => results[0] ); + + var generator2 = generator1.clone().after( result => 1 ); + + // Adding props to gen2 doesn't affect gen1 + assert.equal( generator1.resultsTransforms.length, 1 ); + + assert.equal( generator2.queryTransforms.length, 1 ); + assert.equal( generator2.resultsTransforms.length, 2 ); + }); + + it('.use( middleware )', ()=>{ + var generator1 = new QueryGenerator({ pool }) + .use( QueryTransform.create( query => query ) ); + + assert.equal( generator1.queryTransforms.length, 1 ); + + var generator2 = generator1 + .use( QueryTransform.create( query => query ) ); + + assert.equal( generator1.queryTransforms.length, 1 ); + assert.equal( generator2.queryTransforms.length, 2 ); + + generator1 = generator1.use( ResultTransform.create( results => results ) ); + + assert.equal( generator1.resultsTransforms.length, 1 ); + + var generator3 = new QueryGenerator({ pool }) + .use( gen => gen.mutate( gen => { + gen.before( query => query.where('foo', 'bar') ); + gen.after( results => results[0] ); + })); + + assert.equal( generator3.queryTransforms.length, 1 ); + assert.equal( generator3.resultsTransforms.length, 1 ); + + assert.throws( ()=>{ + generator1.use({}); + }, QueryGenerator.InvalidTransformError ); + }); + + it('.before( transform )', ()=>{ + var generator1 = new QueryGenerator({ pool }) + .before( query => query.where('foo', 'bar') ) + .before( query => query.where('bar.baz', 'foo') ); + + var query1 = generator1.query(); + assert.equal( query1.queryTransforms.length, 2 ); + + var transformed = query1.getTransformedQuery(); + + assert.equal( transformed.queryTransforms.length, 0 ); + + assert.deepEqual( transformed.mosqlQuery.where, { + foo: 'bar' + , bar: { baz: 'foo' } + }); + + var generator2 = generator1 + .before( query => { + return query.clone().mutate(query => { + query.mosqlQuery.where = { byah: true } + }) + }) + + var 
query2 = generator2.query() + + assert.deepEqual( query2.getTransformedQuery().mosqlQuery.where, { + byah: true + }); + }); + + it('.before( transform[] )', ()=>{ + var generator1 = new QueryGenerator({ pool }) + .before([ + query => query.where('foo', 'bar') + , query => query.where('bar.baz', 'foo') + ]); + + var query1 = generator1.query(); + + assert.equal( query1.queryTransforms.length, 2 ); + + var transformed = query1.getTransformedQuery(); + + assert.equal( transformed.queryTransforms.length, 0 ); + + assert.deepEqual( transformed.mosqlQuery.where, { + foo: 'bar' + , bar: { baz: 'foo' } + }); + }); + + it('.after( transform )', ()=>{ + var generator1 = new QueryGenerator({ pool }) + .after( results => results[0] ) + .after( result => { + return { foo: result }; + }); + + var query1 = generator1.query(); + + assert.equal( query1.resultsTransforms.length, 2 ); + + var result = query1.getTransformedResult([ + { a: 1, b: 2 } + , { a: 2, b: 3 } + ]); + + assert.deepEqual( result, { + foo: { a: 1, b: 2 } + }); + }); + + it('.after( transform[] )', ()=>{ + var generator1 = new QueryGenerator({ pool }) + .after([ + results => results[0] + , result => { return { foo: result }; } + ]); + + var query1 = generator1.query(); + + assert.equal( query1.resultsTransforms.length, 2 ); + + var result = query1.getTransformedResult([ + { a: 1, b: 2 } + , { a: 2, b: 3 } + ]); + + assert.deepEqual( result, { + foo: { a: 1, b: 2 } + }); + }); +}); \ No newline at end of file diff --git a/test/query.js b/test/query.js new file mode 100644 index 0000000..e9bf243 --- /dev/null +++ b/test/query.js @@ -0,0 +1,238 @@ +var assert = require('assert'); +var mosql = require('mongo-sql'); +var PGPool = require('pg-pool'); +var QueryOriginal = require('../lib/query'); +var QueryTransform = require('../lib/query-transform'); + +var pool = new PGPool(); + +class Query extends QueryOriginal { + static create( query, options ){ + return new Query( query, options ); + } + + constructor( query, options = {} ){ + options.pool = pool; + super( query, options ); + } +} + +describe('Query', ()=>{ + it('constructor( query, options )', ()=>{ + var query1 = new Query(); + + assert.deepEqual( query1.mosqlQuery, {} ); + assert( query1.options.immutable ); + + var query2 = new Query({ type: 'select' }); + + assert.deepEqual( query2.mosqlQuery, { type: 'select' } ); + assert( query2.options.immutable ); + assert( query1 !== query2 ); + }); + + it('Query.create( query, options )', ()=>{ + var query1 = Query.create(); + + assert.deepEqual( query1.mosqlQuery, {} ); + assert( query1.options.immutable ); + + var query2 = Query.create({ type: 'select' }); + + assert.deepEqual( query2.mosqlQuery, { type: 'select' } ); + assert( query2.options.immutable ); + assert( query1 !== query2 ); + }); + + it('.clone()', ()=>{ + var query1 = Query.create({ + type: 'select' + , table: 'users' + }); + + var query2 = query1.clone(); + + assert( query1 !== query2 ); + + assert.deepEqual( query1.mosqlQuery, { + type: 'select' + , table: 'users' + }); + }); + + it('.instance()', ()=>{ + var query1 = Query.create(); + var query2 = query1.instance(); + + // Clones because is immutable + assert( query1 !== query2 ); + + query1 = Query.create( {}, { immutable: false }); + query2 = query1.instance(); + + // Does not clone because is not immutable + assert( query1 === query2 ); + }); + + it('.table()', ()=>{ + var query1 = Query.create({ type: 'select', table: 'users' }); + + assert.equal( query1.table(), 'users' ); + }); + + it('.table( table )', ()=>{ + var 
query1 = Query.create(); + var query2 = query1.table('users') + + assert.equal( query1.mosqlQuery.table, undefined ); + assert.equal( query2.mosqlQuery.table, 'users' ); + }); + + it('.type()', ()=>{ + var query1 = Query.create({ type: 'select', table: 'users' }); + + assert.equal( query1.type(), 'select' ); + }); + + it('.type( type )', ()=>{ + var query1 = Query.create(); + var query2 = query1.type('select') + + assert.equal( query1.mosqlQuery.type, undefined ); + assert.equal( query2.mosqlQuery.type, 'select' ); + }); + + it('.where()', ()=>{ + var query1 = Query + .create({ type: 'select', table: 'users', where: { foo: 'bar' } }); + + assert.deepEqual( query1.where(), { foo: 'bar' } ); + }); + + it('.where( key )', ()=>{ + var query1 = Query.create({ + type: 'select' + , table: 'users' + , where: { foo: 'bar' } + }); + + assert.equal( query1.where('foo'), 'bar' ); + + var query2 = Query.create({ + type: 'select' + , table: 'users' + , where: { foo: { bar: { baz: 'foo' } } } + }); + + assert.equal( query2.where('foo.bar.baz'), 'foo' ); + }); + + it('.where( obj )', ()=>{ + var query1 = Query + .create({ type: 'select', table: 'users' }) + .where({ foo: 'bar' }); + + assert.deepEqual( query1.mosqlQuery, { + type: 'select' + , table: 'users' + , where: { foo: 'bar' } + }); + + var query2 = query1.where({ bar: 'baz' }); + + assert( query1 !== query2 ); + + assert.deepEqual( query1.mosqlQuery, { + type: 'select' + , table: 'users' + , where: { foo: 'bar' } + }); + + assert.deepEqual( query2.mosqlQuery, { + type: 'select' + , table: 'users' + , where: { foo: 'bar', bar: 'baz' } + }); + }); + + it('.where( key, val )', ()=>{ + var query1 = Query + .create({ type: 'select', table: 'users' }) + .where( 'foo', 'bar' ); + + assert.deepEqual( query1.mosqlQuery, { + type: 'select' + , table: 'users' + , where: { foo: 'bar' } + }); + + var query2 = query1.where( 'bar', 'baz' ); + + assert( query1 !== query2 ); + + assert.deepEqual( query1.mosqlQuery, { + type: 'select' + , table: 'users' + , where: { foo: 'bar' } + }); + + assert.deepEqual( query2.mosqlQuery, { + type: 'select' + , table: 'users' + , where: { foo: 'bar', bar: 'baz' } + }); + }); + + it('.columns([column])', ()=>{ + var query1 = Query + .create({ type: 'select', table: 'users' }); + + assert.equal( query1.columns(), undefined ); + + var query2 = query1.columns('foo'); + + assert.equal( query1.columns(), undefined ); + assert.deepEqual( query2.columns(), ['foo'] ); + }); + + it('.returning([column])', ()=>{ + var query1 = Query + .create({ type: 'select', table: 'users' }); + + assert.equal( query1.returning(), undefined ); + + var query2 = query1.returning('foo'); + + assert.equal( query1.returning(), undefined ); + assert.deepEqual( query2.returning(), ['foo'] ); + }); + + it('.joins([join])', ()=>{ + var query1 = Query + .create({ type: 'select', table: 'users' }); + + assert.equal( query1.joins(), undefined ); + + var query2 = query1.joins({ + target: 'user_books' + , on: { user_id: '$users.id$' } + }); + + assert.equal( query1.joins(), undefined ); + assert.deepEqual( query2.joins(), [ + { type: 'left', target: 'user_books', on: { user_id: '$users.id$' } } + ]); + }); + + it('.toStringAndValues()', ()=>{ + var query1 = Query.create({ + type: 'select' + , table: 'users' + }, { mosql }); + + var result = query1.toStringAndValues(); + + assert.equal( result.query, 'select "users".* from "users"' ); + assert.deepEqual( result.values, [] ); + }); +}); \ No newline at end of file diff --git a/test/relationships.js 
b/test/relationships.js new file mode 100644 index 0000000..4a409e6 --- /dev/null +++ b/test/relationships.js @@ -0,0 +1,432 @@ +var assert = require('assert'); +var PGPool = require('pg-pool'); +var Database = require('../lib/database'); +var Relationships = require('../middleware/relationships'); + +var pool = new PGPool(); + +describe('Relationships', ()=>{ + it('database.use( relationships )', ()=>{ + var db = Database.create() + .use( Relationships() ) + .register('users'); + + assert.equal( db.queryTransforms.length, 1 ); + assert.equal( db.Query, Relationships.Query ); + assert.equal( db.users.Query, Relationships.Query ); + + var query = db.query(); + + assert( query instanceof Relationships.Query ); + assert.equal( query.queryTransforms.length, 1 ); + }); + + describe('Relationships.Query', ()=>{ + it('.one(...)', ()=>{ + var query = Relationships.Query.create( {}, { pool } ) + .one({ table: 'users' }); + + assert.deepEqual( query.mosqlQuery.one, [ + { table: 'users' } + ]); + + var query2 = query.one('orders'); + + assert.deepEqual( query.mosqlQuery.one, [ + { table: 'users'} + ]); + + assert.deepEqual( query2.mosqlQuery.one, [ + { table: 'users' } + , { table: 'orders' } + ]); + }); + + it('.many(...)', ()=>{ + var query = Relationships.Query.create( {}, { pool } ) + .many({ table: 'users' }); + + assert.deepEqual( query.mosqlQuery.many, [ + { table: 'users' } + ]); + + var query2 = query.many('orders'); + + assert.deepEqual( query.mosqlQuery.many, [ + { table: 'users'} + ]); + + assert.deepEqual( query2.mosqlQuery.many, [ + { table: 'users' } + , { table: 'orders' } + ]); + }); + + it('.mixin(...)', ()=>{ + var query = Relationships.Query.create( {}, { pool } ) + .mixin({ table: 'users' }); + + assert.deepEqual( query.mosqlQuery.mixin, [ + { table: 'users' } + ]); + + var query2 = query.mixin('orders'); + + assert.deepEqual( query.mosqlQuery.mixin, [ + { table: 'users'} + ]); + + assert.deepEqual( query2.mosqlQuery.mixin, [ + { table: 'users' } + , { table: 'orders' } + ]); + + var query3 = Relationships.Query.create( {}, { pool } ) + .mixin({ + table: 'users' + , mixin: [{ table: 'poops' }] + }); + + assert.deepEqual( query3.mosqlQuery.mixin, [ + { table: 'users', mixin: [{ table: 'poops' }] } + ]); + }); + + it('.pluck(...)', ()=>{ + var query = Relationships.Query.create( {}, { pool } ) + .pluck({ table: 'users' }); + + assert.deepEqual( query.mosqlQuery.pluck, [ + { table: 'users' } + ]); + + var query2 = query.pluck('orders'); + + assert.deepEqual( query.mosqlQuery.pluck, [ + { table: 'users'} + ]); + + assert.deepEqual( query2.mosqlQuery.pluck, [ + { table: 'users' } + , { table: 'orders' } + ]); + }); + }); + + it('transforms many', ()=>{ + var db = Database.create() + .use( Relationships() ) + .register({ + name: 'users' + , schema: { + id: { type: 'serial', primaryKey: true } + , name: { type: 'text' } + } + }) + .register({ + name: 'books' + , schema: { + id: { type: 'serial', primaryKey: true } + , name: { type: 'text' } + } + }) + .register({ + name: 'user_books' + , schema: { + user_id: { type: 'int', references: { table: 'users', column: 'id' } } + , book_id: { type: 'int', references: { table: 'books', column: 'id' } } + } + }); + + var query = db.users.find().many('user_books'); + + assert.equal( query.queryTransforms.length, 1 ); + assert.deepEqual( query.getTransformedQuery().mosqlQuery, { + "type": "select", + "table": "users", + "many": [ + { + "table": "user_books" + } + ], + "columns": [ + "*", + { + "type": "expression", + "alias": "user_books", + 
"expression": { + "parenthesis": true, + "expression": { + "type": "array_to_json", + "expression": { + "type": "array", + "expression": { + "type": "select", + "columns": [ + { + "type": "row_to_json", + "expression": "r" + } + ], + "table": { + "type": "select", + "table": "user_books", + "where": { + "user_id": "$\"users\".\"id\"$" + }, + "alias": "r", + "qAlias": "r" + } + } + } + } + } + } + ] + }); + }); + + it('transforms one', ()=>{ + var db = Database.create() + .use( Relationships() ) + .register({ + name: 'users' + , schema: { + id: { type: 'serial', primaryKey: true } + , name: { type: 'text' } + } + }) + .register({ + name: 'books' + , schema: { + id: { type: 'serial', primaryKey: true } + , name: { type: 'text' } + } + }) + .register({ + name: 'user_books' + , schema: { + user_id: { type: 'int', references: { table: 'users', column: 'id' } } + , book_id: { type: 'int', references: { table: 'books', column: 'id' } } + } + }); + + var query = db.user_books.find().one('users'); + + assert.equal( query.queryTransforms.length, 1 ); + + assert.deepEqual( query.getTransformedQuery().mosqlQuery, { + "type": "select", + "table": "user_books", + "one": [ + { + "table": "users" + } + ], + "columns": [ + "*", + { + "type": "expression", + "alias": "users", + "expression": { + "parenthesis": true, + "expression": { + "type": "select", + "columns": [ + { + "type": "row_to_json", + "expression": "r" + } + ], + "table": { + "type": "select", + "table": "users", + "alias": "r", + "where": { + "id": "$\"user_books\".\"user_id\"$" + }, + "limit": 1, + "qAlias": "r" + } + } + } + } + ] + }); + }); + + it('transforms mixin', ()=>{ + var db = Database.create() + .use( Relationships() ) + .register({ + name: 'users' + , schema: { + id: { type: 'serial', primaryKey: true } + , name: { type: 'text' } + } + }) + .register({ + name: 'books' + , schema: { + id: { type: 'serial', primaryKey: true } + , author_id: { type: 'int', references: { table: 'users', column: 'id' } } + , name: { type: 'text' } + } + }) + .register({ + name: 'user_books' + , schema: { + user_id: { type: 'int', references: { table: 'users', column: 'id' } } + , book_id: { type: 'int', references: { table: 'books', column: 'id' } } + } + }); + + var query = db.user_books.find().mixin('books'); + + assert.equal( query.queryTransforms.length, 1 ); + + assert.deepEqual( query.getTransformedQuery().mosqlQuery, { + "type": "select", + "table": "user_books", + "mixin": [ + { + "table": "books" + } + ], + "joins": [ + { + "type": "left", + "target": "books", + "alias": undefined, + "on": { + "id": "$\"user_books\".\"book_id\"$" + } + } + ], + "columns": [ + "*", + { + "table": "books", + "name": "*" + } + ] + }); + + var query2 = db.user_books.find() + .mixin({ + table: 'books' + , mixin: [{ table: 'users' }] + }); + + assert.equal( query2.queryTransforms.length, 1 ); + + assert.deepEqual( query2.getTransformedQuery().mosqlQuery, { + "type": "select", + "table": "user_books", + "mixin": [ + { + "table": "books", + "mixin": [{ table: 'users' }] + } + ], + "joins": [ + { + "type": "left", + "target": "books", + "alias": undefined, + "on": { + "id": "$\"user_books\".\"book_id\"$" + } + }, + { + "type": "left", + "target": "users", + "alias": undefined, + "on": { + id: '$"books"."author_id"$' + } + } + ], + "columns": [ + "*", + { + "table": "books", + "name": "*" + }, + { + "table": "users", + "name": "*" + } + ] + }); + }); + + it('transforms pluck', ()=>{ + var db = Database.create() + .use( Relationships() ) + .register({ + name: 'users' + 
, schema: { + id: { type: 'serial', primaryKey: true } + , name: { type: 'text' } + } + }) + .register({ + name: 'books' + , schema: { + id: { type: 'serial', primaryKey: true } + , name: { type: 'text' } + } + }) + .register({ + name: 'user_books' + , schema: { + user_id: { type: 'int', references: { table: 'users', column: 'id' } } + , book_id: { type: 'int', references: { table: 'books', column: 'id' } } + } + }); + + var query = db.users.find().pluck({ table: 'user_books', column: 'book_id' }); + + assert.equal( query.queryTransforms.length, 1 ); + + assert.deepEqual( query.getTransformedQuery().mosqlQuery, { + "type": "select", + "table": "users", + "pluck": [ + { + "table": "user_books", + "column": "book_id" + } + ], + "columns": [ + "*", + { + "type": "expression", + "alias": "user_books", + "expression": { + "parenthesis": true, + "expression": { + "type": "array", + "expression": { + "type": "select", + "columns": [ + "book_id" + ], + "table": { + "type": "select", + "table": "user_books", + "where": { + "user_id": "$\"users\".\"id\"$" + }, + "alias": "r", + "qAlias": "r" + } + } + } + } + } + ] + }); + }); +}); \ No newline at end of file diff --git a/test/table.js b/test/table.js new file mode 100644 index 0000000..f3181d1 --- /dev/null +++ b/test/table.js @@ -0,0 +1,269 @@ +var assert = require('assert'); +var PGPool = require('pg-pool'); +var TableOriginal = require('../lib/table'); +var Query = require('../lib/query'); + +var pool = new PGPool(); + +class Table extends TableOriginal { + static create( options ){ + return new Table( options ); + } + + constructor( options = {} ){ + options.pool = pool; + super( options ); + } +} + +describe('Table', ()=>{ + it('Table.create()', ()=>{ + var table = Table.create({ + name: 'foo' + }); + + assert.equal( table.name, 'foo' ); + }); + + it('.getPrimaryKey()', ()=>{ + var table1 = Table.create({ name: 'foo' }); + + assert.equal( table1.getPrimaryKey(), 'id' ); + + var table2 = Table.create({ + name: 'bar' + , schema: { uuid: { type: 'uuid', primaryKey: true } } + }); + + assert.equal( table2.getPrimaryKey(), 'uuid' ); + }); + + it('.getIdParamWhereClause( where )', ()=>{ + var table1 = Table.create({ name: 'foo' }); + + assert.deepEqual( table1.getIdParamWhereClause(12), { id: 12 }) + assert.deepEqual( table1.getIdParamWhereClause({ foo: 'bar' }), { foo: 'bar' }) + + var table2 = Table.create({ + name: 'bar' + , schema: { uuid: { type: 'uuid', primaryKey: true } } + }); + + assert.deepEqual( table2.getIdParamWhereClause('byah'), { uuid: 'byah' }); + }); + + it('.find([{condition}[, {options}]])', ()=>{ + var table = Table.create({ + name: 'foo' + }); + + var query = table.find(); + + assert.equal( query.mosqlQuery.type, 'select' ); + assert.equal( query.mosqlQuery.table, 'foo' ); + + query = table.find({ foo: 'bar' }); + + assert.deepEqual( query.mosqlQuery.where, { foo: 'bar' } ); + + query = table.find({ foo: 'bar' }, { limit: 10 }); + + assert.deepEqual( query.mosqlQuery, { + type: 'select' + , table: 'foo' + , where: { foo: 'bar' } + , limit: 10 + }); + }); + + it('.findOne(id)', ()=>{ + var table = Table.create({ + name: 'foo' + }); + + var query = table.findOne(100); + + assert.equal( query.mosqlQuery.type, 'select' ); + assert.equal( query.mosqlQuery.table, 'foo' ); + assert.equal( query.mosqlQuery.where.id, 100 ); + assert.deepEqual( query.getTransformedResult([ { foo: 'bar' }, { baz: 'bar' } ] ), { + foo: 'bar' + }); + }); + + it('.findOne({condition})', ()=>{ + var table = Table.create({ + name: 'foo' + }); + + var query = 
table.findOne({ bar: 'baz' }); + + assert.equal( query.mosqlQuery.type, 'select' ); + assert.equal( query.mosqlQuery.table, 'foo' ); + assert.equal( query.mosqlQuery.where.bar, 'baz' ); + }); + + it('.findOne(uuid) using primaryKey', ()=>{ + var table = Table.create({ + name: 'foo' + , schema: { + uuid: { primaryKey: true } + } + }); + + var query = table.findOne('some_uuid'); + + assert.equal( query.mosqlQuery.type, 'select' ); + assert.equal( query.mosqlQuery.table, 'foo' ); + assert.equal( query.mosqlQuery.where.uuid, 'some_uuid' ); + }); + + it('.insert()', ()=>{ + var table = Table.create({ + name: 'foo' + }); + + var query = table.insert().values({ foo: 'bar' }); + + assert.equal( query.mosqlQuery.type, 'insert' ); + assert.equal( query.mosqlQuery.table, 'foo' ); + assert.deepEqual( query.mosqlQuery.values, { foo: 'bar' } ); + + query = table.insert({ bar: 'baz' }); + + assert.equal( query.mosqlQuery.type, 'insert' ); + assert.equal( query.mosqlQuery.table, 'foo' ); + assert.deepEqual( query.mosqlQuery.values, { bar: 'baz' } ); + assert.deepEqual( query.getTransformedResult([{ bar: 'baz' }]), { bar: 'baz' }) + + query = table.insert([{ bar: 'baz' }, { foo: 'bar' }]); + + assert.equal( query.mosqlQuery.type, 'insert' ); + assert.equal( query.mosqlQuery.table, 'foo' ); + assert.deepEqual( query.mosqlQuery.values, [{ bar: 'baz' }, { foo: 'bar' }] ); + assert.deepEqual( query.getTransformedResult([1, 2]), [1, 2] ); + }); + + xit('.upsert(failingColumn, values)', ()=>{ + var table = Table.create({ + name: 'foo' + }); + + var query = table.upsert('email', { + email: 'foo@bar.com' + , name: 'Foo Bar' + }); + + assert.equal( query.mosqlQuery.type, 'insert' ); + assert.equal( query.mosqlQuery.table, 'foo' ); + assert.deepEqual( query.mosqlQuery.values, { + email: 'foo@bar.com' + , name: 'Foo Bar' + }); + + assert.deepEqual( query.mosqlQuery.conflict, { + target: { column: 'email' } + , action: { update: { email: '$excluded.email$', name: '$excluded.name$' } } + }); + }); + + it('.remove()', ()=>{ + var table = Table.create({ + name: 'foo' + }); + + var query = table.remove(1); + + assert.equal( query.mosqlQuery.type, 'delete' ); + assert.equal( query.mosqlQuery.table, 'foo' ); + assert.equal( query.mosqlQuery.where.id, 1 ); + }); + + it('.update()', ()=>{ + var table = Table.create({ + name: 'foo' + }); + + var query = table.update(1).values({ bar: 'baz' }); + + assert.equal( query.mosqlQuery.type, 'update' ); + assert.equal( query.mosqlQuery.table, 'foo' ); + assert.equal( query.mosqlQuery.where.id, 1 ); + assert.equal( query.mosqlQuery.values.bar, 'baz' ); + }); + + it('.create()', ()=>{ + var table = Table.create({ + name: 'foo' + , schema: { id: { type: 'serial', primarykey: true } } + }); + + var query = table.create(); + + assert.deepEqual( query.mosqlQuery, { + type: 'create-table' + , table: 'foo' + , ifNotExists: true + , definition: { id: { type: 'serial', primarykey: true } } + }) + }); + + it('.create() view', ()=>{ + var table = Table.create({ + name: 'foo' + , type: 'view' + , schema: { id: { type: 'serial', primarykey: true } } + , expression: Query.create( {}, { pool }) + .type('select') + .table('users') + .columns('id') + .where({ id: { $gt: 100} }) + }); + + var query = table.create(); + + assert.deepEqual( query.mosqlQuery, { + type: 'create-view' + , view: 'foo' + , orReplace: true + , materialized: false + , expression: { + type: 'select' + // Artifact from mosql + , __defaultTable: 'users' + , table: 'users' + , columns: ['id'] + , where: { id: { $gt: 100} } + } + }); + 
}); + + it('Individual query middleware should not affect all queries from table', ()=>{ + var table = Table.create({ + name: 'foo' + , schema: { id: { type: 'serial', primarykey: true } } + }); + + var q1 = table.findOne(123); + + assert.equal( q1.resultsTransforms.length, 1 ); + + var q2 = table.find(); + + assert.equal( q2.resultsTransforms.length, 0 ); + }); + + it('When cloning a table, pool should be passed', ()=>{ + var table = TableOriginal.create({ + name: 'foo' + , schema: { id: { type: 'serial', primarykey: true } } + , pool + }); + + assert.equal( table.pool, pool ); + + var table2 = table.clone() + + assert.equal( table2.pool, pool ); + }); +}); \ No newline at end of file diff --git a/test/test-dals/test-table.js b/test/test-dals/test-table.js deleted file mode 100644 index 38ebbf6..0000000 --- a/test/test-dals/test-table.js +++ /dev/null @@ -1,8 +0,0 @@ -module.exports = { - name: 'test_tbl' - -, schema: { - id: { type: 'int', primaryKey: true } - , createdAt: { type: 'timestamp', default: 'now()' } - } -}; \ No newline at end of file diff --git a/test/test.js b/test/test.js deleted file mode 100644 index 35eebec..0000000 --- a/test/test.js +++ /dev/null @@ -1,1162 +0,0 @@ -var assert = require('assert'); -var pg = require('pg'); -var async = require('async'); -var dirac = require('../'); - -var dbConfig = { - host: 'localhost' -, port: 5432 -, database: 'dirac_test' -}; - -var connString = 'postgres://' + dbConfig.host + ':' + dbConfig.port + '/' + dbConfig.database; - -var destroyCreateDb = function( callback ){ - // Reset dirac in case someone has already used it - dirac.destroy(); - dirac.init( connString.substring( 0, connString.lastIndexOf('/') ) + '/postgres' ); - - dirac.raw( 'drop database if exists dirac_test', function( error ){ - if ( error ) return callback( error ); - dirac.raw( 'create database dirac_test', function( error ){ - if ( error ) return callback( error ); - - // Reset again for future use - dirac.destroy(); - dirac.init( connString ); - callback(); - }); - }); -}; - -var destroyTables = function( callback ){ - dirac.dropAllTables( { forceDirac: true }, function( error ){ - if ( error ) return callback( error ); - - dirac.destroy(); - dirac.init( connString ); - callback(); - }); -}; - -var tableExists = function( table, callback ){ - var query = 'SELECT * FROM pg_catalog.pg_tables where tablename = $1'; - - dirac.raw( query, [ table ], function( error, result ){ - if ( error ) return callback( error ); - callback( null, result.rows.length > 0 ); - }); -}; - -var columnExists = function( table, column, callback ){ - var query = 'select column_name from information_schema.columns where table_name = $1 and column_name = $2'; - - dirac.raw( query, [ table, column ], function( error, result ){ - if ( error ) return callback( error ); - callback( null, result.rows.length > 0 ); - }); -}; - -var hasConstraint = function( table, column, constraint, callback ){ - var query = [ - , 'select 1 from' - , ' information_schema.constraint_column_usage usage' - , 'left join information_schema.table_constraints constraints' - , ' on constraints.constraint_name = usage.constraint_name' - , 'where constraints.constraint_type = $1' - , 'and usage.table_name = $2' - , 'and usage.column_name = $3' - ].join('\n'); - - var values = [ constraint, table, column ]; - - dirac.raw( query, values, function( error, result ){ - if ( error ) return callback( error ); - return callback( null, result.rows.length > 0 ); - }); -}; - -before( function( done ){ - this.timeout(3000) - 
destroyCreateDb( function( error ){ - if ( error ) throw error; - - done(); - }); -}); - -describe ('Root API', function(){ - - describe ('dirac.init', function(){ - - beforeEach( function(){ - dirac.destroy(); - }); - - it ('should initialize with a connection string', function(){ - dirac.init( connString ); - assert( dirac.options.connString == connString ); - assert( dirac.dals.dirac_schemas instanceof dirac.DAL ); - }); - - it ('should initialize with options', function(){ - dirac.init( dbConfig ); - assert( dirac.options.connString == connString ); - assert( dirac.dals.dirac_schemas instanceof dirac.DAL ); - }); - - it ('should initialize with default options', function(){ - dirac.init({ database: dbConfig.database }); - assert( dirac.options.connString == connString ); - assert( dirac.dals.dirac_schemas instanceof dirac.DAL ); - }); - - it ('should throw an error because missing connection string', function(){ - assert.throws( function(){ - dirac.init(); - }, Error) - }); - - it ('should throw an error because missing host', function(){ - assert.throws( function(){ - dirac.init({}); - }, Error) - }); - - }); - - describe ('dirac.use', function(){ - beforeEach( function(){ - dirac.destroy(); - }); - - it ('should call a function when dirac inits', function(){ - var didCall = false; - var middleware = function( dirac ){ - didCall = true; - // Just make sure we've been passed the dirac object - assert( 'init' in dirac ); - }; - dirac.use( middleware ); - dirac.init( dbConfig ); - assert( didCall ); - }); - - it ('should call a function immediately after dirac inits', function(){ - var didCall = false; - var middleware = function( dirac ){ - didCall = true; - // Just make sure we've been passed the dirac object - assert( 'init' in dirac ); - }; - dirac.init( dbConfig ); - dirac.use( middleware ); - assert( didCall ); - }); - }); - - describe ('dirac.register', function(){ - - it ('should register a new table', function(){ - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - } - }); - - assert( dirac.dals.users instanceof dirac.DAL ); - dirac.unregister( 'users' ); - }); - - it ('should throw an error because the definition is missing', function(){ - assert.throws( function(){ - dirac.register({ - name: 'users' - }); - }); - }); - - }); - - describe ('dirac.unregister', function(){ - - it ('should register a new table', function(){ - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - } - }); - - assert( dirac.dals.users instanceof dirac.DAL ); - dirac.unregister( 'users' ); - assert( !dirac.users ); - }); - - }); - - describe ('dirac.sync', function(){ - - it ('should at least create the dirac_schemas table', function( done ){ - destroyTables( function( error ){ - assert( !error ) - dirac.sync( function( error ){ - assert( !error ); - tableExists( 'dirac_schemas', function( error, result ){ - assert( !error ); - assert( result ); - done(); - }); - }); - }); - }); - - it ('should register a table and sync it', function( done ){ - destroyTables( function( error ){ - assert( !error ) - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - tableExists( 'users', function( error, result ){ - assert( !error ); - assert( result ); - done(); - }); - }); - }); - }); - - it ('should create tables in correct order', 
function( done ){ - destroyTables( function( error ){ - assert( !error ) - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - } - }); - - dirac.register({ - name: 'groups' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - , user_id: { - type: 'int' - , references: { table: 'users', column: 'id' } - } - } - }); - - dirac.register({ - name: 'other_thing' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - , group_id: { - type: 'int' - , references: { table: 'groups', column: 'id' } - } - } - }); - - dirac.register({ - name: 'other_thing2' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - , user_id: { - type: 'int' - , references: { table: 'users', column: 'id' } - } - , group_id: { - type: 'int' - , references: { table: 'groups', column: 'id' } - } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - - async.series( Object.keys( dirac.dals ).map( function( table ){ - return function( callback ){ - tableExists( table, function( error, result ){ - assert( !error ); - assert( result ); - callback(); - }); - } - }), done ); - }); - }); - }); - - it ('should add a new field', function( done ){ - destroyTables( function( error ){ - assert( !error ) - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - tableExists( 'users', function( error, result ){ - assert( !error ); - assert( result ); - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - , email: { type: 'text' } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - - columnExists( 'users', 'email', function( error, result ){ - assert( !error ); - assert( result ); - done(); - }); - }); - }); - }); - }); - }); - - it ('should add a default', function( done ){ - destroyTables( function( error ){ - assert( !error ) - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , other: { type: 'text' } - , name: { type: 'text' } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - tableExists( 'users', function( error, result ){ - assert( !error ); - assert( result ); - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , other: { type: 'text' } - , name: { type: 'text', default: "'poop'" } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - - dirac.dals.users.insert( { other: 'bob' }, { returning: ['*'] }, function( error, result ){ - assert( !error ); - assert( result[0].name == 'poop' ); - done(); - }); - }); - }); - }); - }); - }); - - it ('should add a default but not try to add it again', function( done ){ - destroyTables( function( error ){ - assert( !error ) - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , other: { type: 'text' } - , name: { type: 'text' } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - tableExists( 'users', function( error, result ){ - assert( !error ); - assert( result ); - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , other: { type: 'text' } - , name: { type: 'text', default: "'poop'" } - } - }); - - dirac.sync( function( error ){ - 
assert( !error ); - - dirac.dals.users.insert( { other: 'bob' }, { returning: ['*'] }, function( error, result ){ - assert( !error ); - assert( result[0].name == 'poop' ); - - dirac.sync( function( error ){ - assert( !error ); - done(); - }); - }); - }); - }); - }); - }); - }); - - it ('should add a unique constraint', function( done ){ - destroyTables( function( error ){ - assert( !error ) - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , other: { type: 'text' } - , name: { type: 'text' } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - tableExists( 'users', function( error, result ){ - assert( !error ); - assert( result ); - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , other: { type: 'text' } - , name: { type: 'text', unique: true } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - - hasConstraint( 'users', 'name', 'UNIQUE', function( error, result ){ - assert( !error ); - assert( result ); - done(); - }); - }); - }); - }); - }); - }); - - it ('should add a unique constraint and then remove it', function( done ){ - destroyTables( function( error ){ - assert( !error ) - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , other: { type: 'text' } - , name: { type: 'text' } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - hasConstraint( 'users', 'name', 'UNIQUE', function( error, result ){ - assert( !error ); - assert( !result ); - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , other: { type: 'text' } - , name: { type: 'text', unique: true } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - - hasConstraint( 'users', 'name', 'UNIQUE', function( error, result ){ - assert( !error ); - assert( result ); - - // Regression test for ensuring we can sync without it trying - // to add the unique twice - dirac.sync( function( error ){ - assert(!error); - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , other: { type: 'text' } - , name: { type: 'text' } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - - hasConstraint( 'users', 'name', 'UNIQUE', function( error, result ){ - assert( !error ); - assert( !result ); - done(); - }); - }); - }); - }); - }); - }); - }); - }); - }); - - it ('should add a primary key, then move it to another column', function( done ){ - destroyTables( function( error ){ - assert( !error ) - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , other: { type: 'text' } - , name: { type: 'text' } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - tableExists( 'users', function( error, result ){ - assert( !error ); - assert( result ); - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - } - , other: { type: 'text', primaryKey: true } - , name: { type: 'text' } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - - hasConstraint( 'users', 'other', 'PRIMARY KEY', function( error, result ){ - assert( !error ); - assert( result ); - - dirac.register({ - name: 'users' - , schema: { - id: { type: 'serial' } - , other: { type: 'text' } - , name: { type: 'text', primaryKey: true } - } - }); - - dirac.sync( function( error ){ - assert( !error ); - - hasConstraint( 'users', 'name', 'PRIMARY KEY', function( error, result ){ - assert( 
!error ); - assert( result ); - done(); - }); - }); - }); - }); - }); - }); - }); - }); - - it ('should forcibly sync', function( done ){ - destroyTables( function( error ){ - assert( !error ) - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - } - }); - - dirac.sync( { force: true }, function( error ){ - assert( !error ); - dirac.dals.users.insert({ name: 'Bob' }, function( error, result ){ - assert( !error ); - assert( result ); - - dirac.sync( { force: true }, function( error ){ - assert( !error ); - dirac.dals.users.findOne( {}, function( error, result ){ - assert( !error ); - assert( !result ); - done(); - }); - }); - }); - }) - }); - }); - - it ('should register a view', function( done ){ - destroyTables( function( error ){ - assert( !error ) - - var view; - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - } - }); - - dirac.register( view = { - name: 'bobs' - , type: 'view' - , query: { - type: 'select' - , table: 'users' - , where: { name: { $ilike: 'bob' } } - } - }); - - assert( dirac.views[ view.name ] ); - assert( dirac.dals[ view.name ] ); - - dirac.sync( function( error ){ - assert( !error ); - - async.series( - ['Bob', 'Alice'].map( function( name ){ - return function( _done ){ - dirac.dals.users.insert({ name: name }, _done); - } - }) - , function( error ){ - assert( !error ); - - dirac.dals.bobs.find( {}, function( error, results ){ - assert( !error ); - - assert( results.filter( function( u ){ - return u.name.toLowerCase() == view.query.where.name.$ilike; - }).length, results.length ); - - done(); - }); - } - ); - }); - }); - }); - - it ('should wait to query while syncing', function( done ){ - destroyTables( function( error ){ - assert( !error ) - - var view; - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - } - }); - - dirac.sync(); - - dirac.dals.users.insert({ name: 'Bob' }, function( error ){ - assert( !error ); - done(); - }); - }); - }); - - }); - - describe ('dirac.remove', function() { - before(function(done){ - destroyTables( function( error ){ - if ( error ) return done( error ); - - dirac.destroy(); - dirac.init( connString ); - - dirac.register({ - name: 'happy_meals' - , schema: { - toy_id: { - type: 'int' - } - , meal_id: { - type: 'int' - } - } - }); - - dirac.sync( done ); - }); - }); - - it ('should remove', function( done ){ - dirac.dals.happy_meals.insert({ toy_id: 3, meal_id: 5}, function(err) { - assert(!err); - dirac.dals.happy_meals.remove({ toy_id: 3 }, function(err, results) { - assert(!err); - assert(results.length === 1); - done(); - }); - }); - }); - }); - describe ('dirac.tx', function() { - - before(function(done){ - destroyTables( function( error ){ - if ( error ) return done( error ); - - dirac.destroy(); - dirac.init( connString ); - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - } - }); - - dirac.sync( done ); - }); - }); - - it ('should perform transaction', function( done ){ - var tx = dirac.tx.create(); - - tx.begin(function(err) { - assert( !err ); - tx.users.insert({ name: 'red fish' }, function(err) { - assert( !err ); - tx.users.insert({ name: 'blue fish' }, function(err) { - assert( !err ); - tx.commit( function( error ){ - assert(!error); - - dirac.dals.users.find({ name: { $or: [ 'red fish', 'blue fish' ] } }, 
function( err, users ){ - assert( !err ); - assert.equal( users.length, 2 ); - done(); - }); - }); - }); - }); - }); - }); - - it ('should perform transaction async', function( done ){ - var tx = dirac.tx.create(); - - async.series([ - tx.begin.bind(tx) - , tx.users.insert.bind(tx.users, { name: 'woody' }) - , tx.users.insert.bind(tx.users, { name: 'buzz' }) - ], function(err, results) { - assert( !err ); - - tx.commit(function( err ){ - assert( !err ); - - dirac.dals.users.find({ name: { $or: [ 'woody', 'buzz' ] } }, function( err, users ){ - assert( !err ); - assert.equal( users.length, 2 ); - done(); - }) - }); - }); - }); - - it ('should throw error attempting to query after commit', function( done ){ - var tx = dirac.tx.create(); - - async.series([ - tx.begin.bind(tx) - , tx.users.insert.bind(tx.users, { name: 'red fish' }) - , tx.commit.bind(tx) - ], function(err, results) { - assert( !err ); - assert.throws(tx.users.insert.bind(this, { name: 'blue fish' })); - done(); - }); - }); - - it ('should lock table in a transaction', function( done ){ - var tx = dirac.tx.create(); - async.series([ - dirac.dals.users.insert.bind(dirac.dals.users, { name: 'red fish' }) - , tx.begin.bind(tx) - , tx.users.lock.bind(tx.users, 'ACCESS EXCLUSIVE') - , tx.users.update.bind(tx.users, { name: 'red fish'}, { name: 'blue fish'}) - , tx.commit.bind(tx) - ], function(err){ - assert(!err); - done(); - }); - }); - - it ('should lock table without specifying mode (ACCESS EXCLUSIVE)', function( done ){ - var tx = dirac.tx.create(); - async.series([ - dirac.dals.users.insert.bind(dirac.dals.users, { name: 'red fish' }) - , tx.begin.bind(tx) - , tx.users.lock.bind(tx.users) - , tx.users.update.bind(tx.users, { name: 'red fish'}, { name: 'blue fish'}) - , tx.commit.bind(tx) - ], function(err){ - assert(!err); - done(); - }); - }); - - it ('should fail trying to lock table outside of transaction blocks', function( done ){ - dirac.dals.users.lock(function(err) { - assert(err); - done(); - }) - }); - }); - -}); -/* -describe ('DAL API', function(){ - describe ('DAL.find', function(){ - - var fixtureOptions = { - users: { - numToGenerate: 100 - } - }; - - before(function(done){ - destroyTables( function( error ){ - if ( error ) return done( error ); - - dirac.destroy(); - dirac.init( connString ); - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - } - }); - - dirac.sync( function( error ){ - if ( error ) return done( error ); - - var fns = []; - for ( var i = 1; i <= fixtureOptions.users.numToGenerate; i++ ){ - fns.push(function( callback ){ - dirac.dals.users.insert({ - name: 'User ' + i - }, callback ); - }); - } - - async.series( fns, done ); - }); - }); - }); - - it ('should return all users', function( done ){ - dirac.dals.users.find( {}, function( error, results ){ - assert( !error ); - assert( results.length == fixtureOptions.users.numToGenerate ); - done(); - }); - }); - - it ('add simple where clause', function( done ){ - var $query = { - id: { - $gt: parseInt( fixtureOptions.users.numToGenerate / 2 ) - } - } - dirac.dals.users.find( $query, function( error, results ){ - assert( !error ); - assert( - results.filter( function( result ){ - return result.id > $query.id.$gt - }).length == results.length - ); - done(); - }); - }); - }); - - describe ('DAL.update', function(){ - - before(function(done){ - destroyTables( function( error ){ - if ( error ) return done( error ); - - dirac.destroy(); - dirac.init( connString ); - - 
dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - } - }); - - dirac.sync( function( error ){ - if ( error ) return done( error ); - - dirac.dals.users.insert( { name: 'User' }, done ); - }); - }); - }); - - it ('should update', function( done ){ - var $update = { name: 'Bob' }; - dirac.dals.users.update( 1, $update, function( error, results ){ - assert( !error ); - dirac.dals.users.findOne( 1, function( error, result ){ - assert( !error ); - assert( result.name, $update.name ); - done(); - }); - }); - }); - }); - - describe ('DAL.remove', function(){ - - before(function(done){ - destroyTables( function( error ){ - if ( error ) return done( error ); - - dirac.destroy(); - dirac.init( connString ); - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - } - }); - - dirac.sync( function( error ){ - if ( error ) return done( error ); - - dirac.dals.users.insert( { name: 'User' }, done ); - }); - }); - }); - - it ('should remove', function( done ){ - dirac.dals.users.remove( 1, function( error, results ){ - assert( !error ); - dirac.dals.users.findOne( 1, function( error, result ){ - assert( !error ); - assert( !result ); - done(); - }); - }); - }); - }); - - describe ('DAL filters', function(){ - - var fixtureOptions = { - users: { - numToGenerate: 100 - } - }; - - before(function(done){ - destroyTables( function( error ){ - if ( error ) return done( error ); - - dirac.destroy(); - dirac.init( connString ); - - dirac.register({ - name: 'users' - , schema: { - id: { - type: 'serial' - , primaryKey: true - } - , name: { type: 'text' } - } - }); - - dirac.sync( function( error ){ - if ( error ) return done( error ); - - var fns = []; - for ( var i = 1; i <= fixtureOptions.users.numToGenerate; i++ ){ - fns.push(function( callback ){ - dirac.dals.users.insert({ - name: 'User ' + i - }, callback ); - }); - } - - async.series( fns, done ); - }); - }); - }); - - it ('should add a before filter', function(){ - var gotCalled = false; - - dirac.dals.users.before( 'insert', function( $query, schema, next ){ - gotCalled = true; - next(); - }); - - dirac.dals.users.insert({ name: 'Bob' }); - assert( gotCalled ); - }); - - it ('should add an after filter', function( done ){ - var gotCalled = false; - - dirac.dals.users.after( 'find', function( results, $query, schema, next ){ - gotCalled = true; - next(); - }); - - dirac.dals.users.find({ name: 'Bob' }, function( error, result ){ - assert( !error ); - assert( gotCalled ); - done(); - }); - }); - - }); -}); -*/ diff --git a/test/utils.js b/test/utils.js new file mode 100644 index 0000000..d6d8cec --- /dev/null +++ b/test/utils.js @@ -0,0 +1,60 @@ +var assert = require('assert'); +var createTableGraph = require('../lib/table-graph'); + +describe('Utils', ()=>{ + it('createTableGraph( tables )', ()=>{ + var graph = createTableGraph({ + 'users': { + name: 'users' + , schema: { + id: { type: 'serial', primaryKey: true } + , name: { type: 'text' } + } + } + , 'books': { + name: 'books' + , schema: { + id: { type: 'serial', primaryKey: true } + , name: { type: 'text' } + } + } + , 'user_books': { + name: 'user_books' + , schema: { + user_id: { type: 'int', references: { table: 'users', column: 'id' } } + , book_id: { type: 'int', references: { table: 'books', column: 'id' } } + } + } + }); + + assert.deepEqual( graph, { + "users": { + "dependents": { + "user_books": { + "id": "user_id" + } + }, + "dependencies": {} + 
}, + "books": { + "dependents": { + "user_books": { + "id": "book_id" + } + }, + "dependencies": {} + }, + "user_books": { + "dependents": {}, + "dependencies": { + "users": { + "user_id": "id" + }, + "books": { + "book_id": "id" + } + } + } + }); + }); +}); \ No newline at end of file