diff --git a/.spelling-wordlist.txt b/.spelling-wordlist.txt index 8fe1841..74452de 100644 --- a/.spelling-wordlist.txt +++ b/.spelling-wordlist.txt @@ -1,5 +1,7 @@ API APIs +BIGINT +Datatypes DBC DBD DBI @@ -11,9 +13,18 @@ hostname ini iODBC jdbc +JSON +ldd +libodbc lockin +Makefile +Nullable +NULLs odbc ODBC +ODBCEnv +ODBCDbc +ODBCStmt ODBC's ORM pony @@ -26,9 +37,17 @@ postgreSQL RDBMS schemas se +SMALLINT +SQLBigInteger +SQLFloat +SQLInteger +SQLReal +SQLSmallInteger SQLState SQLStates +SQLVarchar STMT targeting unixODBC unix +VARCHAR diff --git a/docs/assets/shakespeare-schema.png b/docs/assets/shakespeare-schema.png new file mode 100644 index 0000000..24abeed Binary files /dev/null and b/docs/assets/shakespeare-schema.png differ diff --git a/docs/simple/connecting.md b/docs/simple/connecting.md new file mode 100644 index 0000000..ba6002e --- /dev/null +++ b/docs/simple/connecting.md @@ -0,0 +1,58 @@ +# Connecting To Our Database + +As we mentioned in our overview, before we can create any queries, we need a Statement Handle. In order to get a Statement Handle, we need a Database Handle. In order to get a Database Handle, we need an Environment Handle. + +In this API, these are encapsulated as follows: + +* Environment Handle, ODBCEnv +* Database Handle, ODBCDbc +* Statement Handle, ODBCStmt + +## General Structure of the API + +The vast majority of calls on ODBCEnv, ODBCDbc, and ODBCStmt return `Bool ?`. In other words, these are partial functions. + +Structuring the API in this way means that we can serialize our calls and choose to fail / rollback a transaction if any of them fail. It makes for a clean interface. + +```pony +use "debug" +use "pony-odbc" +use "lib:odbc" + +actor Main + let env: Env + + new create(env': Env) => + env = env' + + let enh: ODBCEnv = ODBCEnv + try + let dbh: ODBCDbc = enh.dbc()? + dbh.connect("psql-demo")? 
+  else
+    Debug.out("We were unable to create our database handle")
+  end
+```
+
+Let's tease this apart:
+
+First we create our Environment Object:
+
+```pony
+  let enh: ODBCEnv = ODBCEnv
+```
+
+Then we create our Database Object and connect to our database using the DSN we configured previously in our .odbc.ini file:
+
+```pony
+  try
+    let dbh: ODBCDbc = enh.dbc()?
+    dbh.connect("psql-demo")?
+  else
+    Debug.out("We were unable to create our database handle")
+  end
+```
+
+Once the `dbh.connect()?` call is complete, we have an authenticated connection to the database instance that we requested.
+
+Next up - let's create some tables! diff --git a/docs/simple/custom-types.md b/docs/simple/custom-types.md new file mode 100644 index 0000000..dcf2c80 --- /dev/null +++ b/docs/simple/custom-types.md @@ -0,0 +1 @@ +# Placeholder diff --git a/docs/simple/index.md b/docs/simple/index.md index 98eab36..b2458d7 100644 --- a/docs/simple/index.md +++ b/docs/simple/index.md @@ -1 +1,32 @@ -# Simple API +# Simple API Overview
+
+The main purpose of this Simple API is to provide 95% of the functionality required to write any database integration you need, without exposing yourself to any of the sharp corners of the C API.
+
+## Philosophy
+
+We have tried to stay consistent in the design of this API in order to make writing your code as unobtrusive as possible. Here are the principles we have tried to follow:
+
+### Transactions and Code Blocks
+
+When writing a database application, it is fairly common to batch SQL statements into transactions. We do this because we want the statements to either succeed or fail as one unit. If the statements succeed, then we `commit` the transaction. If they fail, we `rollback` the transaction, and it's as if none of the statements were executed at all.
+
+Pony has a mechanism that is well suited to this - partial functions!
+
+```pony
+try
+  statement_handle.direct_exec("some SQL command")?
+  statement_handle.direct_exec("some other SQL command")?
+  statement_handle.direct_exec("etc etc etc ...")?
+  database_handle.commit()
+else
+  database_handle.rollback()
+end
+```
+
+### SQL Datatypes
+
+There are two ways to get data in and out of the ODBC API: via native C types, or via a textual representation. After much experimentation - and the discovery that consistency does not seem to be a virtue across database vendors - we have chosen to use the textual representation.
+
+This means that all SQL Datatypes are read and written via text buffers, the size of which needs to be determined in advance.
+
+It also means that custom database types can be added trivially, by defining their buffer size and writing type-specific read() and write() functions. See [Implementing Custom Types](https://redvers.github.io/odbc-tutorial/simple/custom-types.html) for more details. diff --git a/docs/simple/populating.md b/docs/simple/populating.md new file mode 100644 index 0000000..dcf2c80 --- /dev/null +++ b/docs/simple/populating.md @@ -0,0 +1 @@ +# Placeholder diff --git a/docs/simple/queries.md b/docs/simple/queries.md new file mode 100644 index 0000000..dcf2c80 --- /dev/null +++ b/docs/simple/queries.md @@ -0,0 +1 @@ +# Placeholder diff --git a/docs/simple/sqltypes.md b/docs/simple/sqltypes.md new file mode 100644 index 0000000..0d746f1 --- /dev/null +++ b/docs/simple/sqltypes.md @@ -0,0 +1,42 @@ +# About SQL Types
+
+As mentioned previously, all data passes in and out of the database via textual buffers. In principle you can just use the SQLVarchar type for all input and output parameters and do the type conversions yourself.
However, we have provided the default ODBC SQL Types as follows:
+
+| ODBC SQL Type | Pony API Type | Pony Type | Max String Size |
+|----------------|-----------------|-----------|-----------------|
+| SMALLINT | SQLSmallInteger | I16 | 8 |
+| INTEGER | SQLInteger | I32 | 15 |
+| BIGINT | SQLBigInteger | I64 | 22 |
+| FLOAT | SQLFloat | F32 | 20 |
+| REAL | SQLReal | F64 | 30 |
+| VARCHAR | SQLVarchar | String | N/A |
+
+## All about NULL
+
+Every SQL Datatype we create must also support a Nullable version. Rather than defining duplicate Nullable types for everything, we took the following approach:
+
+### Reading NULLs
+
+```pony
+var my_sql_integer: SQLInteger = SQLInteger
+
+/* Stuff happens here (explained on the next page)
+ * which populates this variable with the result
+ * of our query. */
+
+if my_sql_integer.is_null() then
+  // The database returned SQLNull for this value
+else
+  var myint: I32 = my_sql_integer.read()?
+end
+```
+
+If your schema has marked a column as `NOT NULL`, then you can safely call `read()?` without testing for NULL.
+
+If, however, the value is NULL and you call `read()?` without testing for it first, the function will error.
+
+### Writing NULLs
+
+All pony SQL Types default to NULL, so all that needs to be done is to create your object and not set a value.
+
+If you are reusing an object, you can call either `reset()` or `null()` (if you want to be more explicit). diff --git a/docs/simple/stage1.md b/docs/simple/stage1.md new file mode 100644 index 0000000..06ca2b0 --- /dev/null +++ b/docs/simple/stage1.md @@ -0,0 +1,78 @@ +# Including pony-odbc
+
+Let's go ahead and create a new pony project.
+ +```shell +red@panic:~/projects$ mkdir psql-demo +red@panic:~/projects$ cd psql-demo +red@panic:~/projects/psql-demo$ corral init +red@panic:~/projects/psql-demo$ corral add github.com/redvers/pony-odbc.git --version 0.3.0 +red@panic:~/projects/psql-demo$ corral fetch +git cloning github.com/redvers/pony-odbc.git into /home/red/projects/psql-demo/_repos/github_com_redvers_pony_odbc_git +git checking out @0.3.0 into /home/red/projects/psql-demo/_corral/github_com_redvers_pony_odbc +red@panic:~/projects/psql-demo$ +``` + +Let's create a very minimal Makefile + +```make +all: + corral run -- ponyc -d + ./psql-demo +``` + +... and our initial main.pony + +```pony +use "pony-odbc" +use "lib:odbc" // For unixODBC. For iODBC, use "lib:iodbc" + +actor Main + let env: Env + + new create(env': Env) => + env = env' +``` + +Now go ahead and run make, and run ldd to double-check that the library linked correctly: + +```shell +red@panic:~/projects/psql-demo$ make +corral run -- ponyc -d + exit: Exited(0) + out: + err: Building builtin -> /home/red/.local/share/ponyup/ponyc-release-0.59.0-x86_64-linux-ubuntu24.04/packages/builtin +Building . 
-> /home/red/projects/psql-demo +Building pony-odbc -> /home/red/projects/psql-demo/_corral/github_com_redvers_pony_odbc/pony-odbc +Building debug -> /home/red/.local/share/ponyup/ponyc-release-0.59.0-x86_64-linux-ubuntu24.04/packages/debug +Building ffi -> /home/red/projects/psql-demo/_corral/github_com_redvers_pony_odbc/pony-odbc/ffi +Building collections -> /home/red/.local/share/ponyup/ponyc-release-0.59.0-x86_64-linux-ubuntu24.04/packages/collections +Building pony_test -> /home/red/.local/share/ponyup/ponyc-release-0.59.0-x86_64-linux-ubuntu24.04/packages/pony_test +Building time -> /home/red/.local/share/ponyup/ponyc-release-0.59.0-x86_64-linux-ubuntu24.04/packages/time +Building random -> /home/red/.local/share/ponyup/ponyc-release-0.59.0-x86_64-linux-ubuntu24.04/packages/random +Generating + Reachability + Selector painting + Data prototypes + Data types + Function prototypes + Functions + Descriptors +Verifying +Writing ./psql-demo.o +Linking ./psql-demo + +./psql-demo +red@panic:~/projects/psql-demo$ ldd ./psql-demo + linux-vdso.so.1 (0x00007e36767fb000) + libodbc.so.2 => /lib/x86_64-linux-gnu/libodbc.so.2 (0x00007e3676748000) + libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007e367665f000) + libatomic.so.1 => /lib/x86_64-linux-gnu/libatomic.so.1 (0x00007e3676654000) + libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007e3676626000) + libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007e3676400000) + /lib64/ld-linux-x86-64.so.2 (0x00007e36767fd000) + libltdl.so.7 => /lib/x86_64-linux-gnu/libltdl.so.7 (0x00007e3676619000) +red@panic:~/projects/psql-demo$ +``` + +Note that libodbc.so.2 is linked into our executable. diff --git a/docs/simple/tablecreate.md b/docs/simple/tablecreate.md new file mode 100644 index 0000000..2f34463 --- /dev/null +++ b/docs/simple/tablecreate.md @@ -0,0 +1,87 @@ +# Creating Our Tables + +In order to create our tables, we simply execute the SQL commands. 
As there are no parameters, we can use the function `direct_exec("my sql statement")?`, which will either succeed or fail.
+
+## Creating Statements
+
+In our application, let's make a Statement Handle and pass it to a function which will create our tables. Here is our full example thus far:
+
+```pony
+use "debug"
+use "pony-odbc"
+use "lib:odbc"
+
+actor Main
+  let env: Env
+
+  new create(env': Env) =>
+    env = env'
+
+    let enh: ODBCEnv = ODBCEnv
+    try
+      let dbh: ODBCDbc = enh.dbc()?
+      dbh.connect("psql-demo")?
+      let sth: ODBCStmt = dbh.stmt()?
+      try
+        create_tables(sth)?
+      else
+        Debug.out("Our create_tables() function threw an error")
+        error
+      end
+    else
+      Debug.out("Our application failed in some way")
+    end
+```
+
+Now let's create our function to make these (temporary) tables. Note that the statement handle `sth` is the receiver of the `.>` chain:
+
+```pony
+  fun create_tables(sth: ODBCStmt)? =>
+    sth
+      .> direct_exec(
+        """
+        CREATE TEMPORARY TABLE play (
+          id BIGSERIAL,
+          name VARCHAR(30) NOT NULL
+        );
+        """)?
+      .> direct_exec(
+        """
+        ALTER TABLE play ADD CONSTRAINT play_pkey PRIMARY KEY (id);
+        """)?
+      .> direct_exec(
+        """
+        CREATE TEMPORARY TABLE player (
+          id BIGSERIAL,
+          name VARCHAR(20) NOT NULL
+        );
+        """)?
+      .> direct_exec(
+        """
+        ALTER TABLE player ADD CONSTRAINT player_pkey PRIMARY KEY (id);
+        """)?
+      .> direct_exec(
+        """
+        CREATE TEMPORARY TABLE line (
+          id BIGSERIAL,
+          id_play INTEGER,
+          id_player INTEGER,
+          playerlinenumber INTEGER,
+          actsceneline VARCHAR(15),
+          playerline VARCHAR(127) NOT NULL
+        );
+        """)?
+      .> direct_exec(
+        """
+        ALTER TABLE line ADD CONSTRAINT line_pkey PRIMARY KEY (id);
+        """)?
+      .> direct_exec(
+        """
+        ALTER TABLE line ADD CONSTRAINT line_id_play_fkey FOREIGN KEY (id_play) REFERENCES play(id);
+        """)?
+      .> direct_exec(
+        """
+        ALTER TABLE line ADD CONSTRAINT line_id_player_fkey FOREIGN KEY (id_player) REFERENCES player(id);
+        """)?
+```
+
+Since we are neither binding parameters nor reading results in this example, we can simply use the `direct_exec()?` function, which does, as the name suggests, direct execution. diff --git a/docs/simple/transactions.md b/docs/simple/transactions.md new file mode 100644 index 0000000..dcf2c80 --- /dev/null +++ b/docs/simple/transactions.md @@ -0,0 +1 @@ +# Placeholder diff --git a/docs/simple/tutorial.md b/docs/simple/tutorial.md new file mode 100644 index 0000000..423dfb0 --- /dev/null +++ b/docs/simple/tutorial.md @@ -0,0 +1,57 @@ +# Simple API Tutorial
+
+In this tutorial we're going to write a somewhat simple database application to store and query lines from two of William Shakespeare's plays.
+
+Admittedly, the schema will be over-engineered for demonstration purposes.
+
+In our application we will check whether our tables exist and, if not, create them.
+
+Then we will parse the JSON files in the data/ directory and populate the tables.
+
+Finally, we will run various queries on our data.
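+
+To give a flavour of those queries, here is the kind of SQL we will eventually run against the schema below - fetching every line spoken in a given play, in order. The play title is illustrative; any title stored in the `play` table will work:
+
+```sql
+-- All lines from one play, with the speaker's name, in spoken order
+SELECT player.name, line.playerline
+  FROM line
+  JOIN player ON line.id_player = player.id
+  JOIN play   ON line.id_play   = play.id
+ WHERE play.name = 'macbeth'
+ ORDER BY line.playerlinenumber;
+```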
+
+## Schema
+
+![Schema Image](../assets/shakespeare-schema.png)
+
+The tables are defined as follows:
+
+### Table: play
+
+```sql
+CREATE TABLE play (
+  id BIGSERIAL,
+  name VARCHAR(30) NOT NULL
+);
+
+ALTER TABLE play ADD CONSTRAINT play_pkey PRIMARY KEY (id);
+```
+
+### Table: player
+
+```sql
+CREATE TABLE player (
+  id BIGSERIAL,
+  name VARCHAR(20) NOT NULL
+);
+
+ALTER TABLE player ADD CONSTRAINT player_pkey PRIMARY KEY (id);
+```
+
+### Table: line
+
+```sql
+CREATE TABLE line (
+  id BIGSERIAL,
+  id_play INTEGER,
+  id_player INTEGER,
+  playerlinenumber INTEGER,
+  actsceneline VARCHAR(15),
+  playerline VARCHAR(127) NOT NULL
+);
+
+ALTER TABLE line ADD CONSTRAINT line_pkey PRIMARY KEY (id);
+
+ALTER TABLE line ADD CONSTRAINT line_id_play_fkey FOREIGN KEY (id_play) REFERENCES play(id);
+ALTER TABLE line ADD CONSTRAINT line_id_player_fkey FOREIGN KEY (id_player) REFERENCES player(id);
+``` diff --git a/mkdocs.yml b/mkdocs.yml index 8097868..673ec4c 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -78,6 +78,15 @@ nav: - Which API Should I Use?: "start/which.md" - The "Simple" API: - Overview: "simple/index.md"
+      - Tutorial Schema: "simple/tutorial.md"
+      - Including pony-odbc: "simple/stage1.md"
+      - Connecting To Our Database: "simple/connecting.md"
+      - Creating Our Tables: "simple/tablecreate.md"
+      - About SQL Types: "simple/sqltypes.md"
+      - Populating Our Table: "simple/populating.md"
+      - Simple Queries: "simple/queries.md"
+      - Transactions, Commits, and Rollbacks: "simple/transactions.md"
+      - Implementing Custom Types: "simple/custom-types.md"
 - The "Raw" API: - Overview: "raw/index.md" - The "ORM" API: