WiP: Save state in LO Event #209
Changes from all commits: f1d279c, e36ee46, acaec6b, ea7873a, 0c4fdaf, e47734b, 13ab13c, c30165d, 14dfb25, bc54dac, feb6514, 6fea635, 92d5624, 0c0897c, 6a01cf8, fa9f587, 07dfb26, ebde497, b30dee4
New file:

```diff
@@ -0,0 +1,36 @@
+import learning_observer.kvs
+import learning_observer.stream_analytics.helpers as sa_helpers
+
+
+def state_blob():
+    '''Dummy function for the reducer name portion of the
+    KVS key
+    '''
+    pass
+
+
+def _make_key(user_id, source, activity):
+    '''Helper function to format keys for the KVS
+    '''
+    key = sa_helpers.make_key(
+        state_blob,
+        {
+            sa_helpers.EventField('source'): source,
+            sa_helpers.EventField('activity'): activity,
+            sa_helpers.KeyField.STUDENT: user_id
+        },
+        sa_helpers.KeyStateType.INTERNAL
+    )
+    return key
+
+
+async def fetch_blob(user_id, source, activity):
+    '''Fetch the blob from the KVS
+    '''
+    key = _make_key(user_id, source, activity)
+    kvs = learning_observer.kvs.KVS()
+    return await kvs[key]
+
+
+async def save_blob(user_id, source, activity, blob):
+    '''Store a blob in the KVS
+    '''
+    key = _make_key(user_id, source, activity)
+    kvs = learning_observer.kvs.KVS()
+    await kvs.set(key, blob)
```
```diff
@@ -38,6 +38,7 @@
 import learning_observer.auth.events
 import learning_observer.adapters.adapter
 import learning_observer.blacklist
+import learning_observer.blob_storage
 
 import learning_observer.constants as constants
```
```diff
@@ -427,7 +428,7 @@ async def decode_lock_fields(events):
     '''
     async for event in events:
         if event['event'] == 'lock_fields':
-            if event['fields'].get('source', '') != lock_fields.get('source', ''):
+            if 'source' not in event['fields'] or event['fields'].get('source', '') != lock_fields.get('source', ''):
                 lock_fields.update(event['fields'])
         else:
             event.update(lock_fields)
```
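The effect of the widened condition: a `lock_fields` event that omits `source` now always refreshes the locked fields, instead of being skipped when both sides defaulted to `''`. A small sketch of just the predicate, with assumed plain-dict shapes outside the real pipeline:

```python
def should_update_old(fields, lock_fields):
    # Condition before the change
    return fields.get('source', '') != lock_fields.get('source', '')


def should_update_new(fields, lock_fields):
    # Condition after the change: always refresh when the incoming
    # lock_fields event carries no 'source' at all
    return 'source' not in fields or fields.get('source', '') != lock_fields.get('source', '')


locked = {}                          # nothing locked yet
incoming = {'activity': 'essay-1'}   # lock_fields event without a source

print(should_update_old(incoming, locked))  # False: both default to '', update skipped
print(should_update_new(incoming, locked))  # True: missing 'source' now forces an update
```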
```diff
@@ -447,6 +448,35 @@ async def filter_blacklist_events(events):
         await ws.send_json(bl_status)
         await ws.close()
 
 
+async def process_blob_storage_events(events):
+    '''HACK This function manages events related to storing and
+    retrieving blobs from server-side storage. It is primarily
+    used for LO Assess. Ideally, this functionality should reside
+    in an independent module, rather than being directly integrated
+    into Learning Observer, as it is currently implemented.
+    '''
+    async for event in events:
+        # Extract metadata
+        if event['event'] in ['save_blob', 'fetch_blob']:
+            user_id = event['auth']['user_id']
+            source = event['source']
+            activity = event['activity']
+
+            # Save, fetch, or ignore (continue)
+            if event['event'] == 'save_blob':
+                await learning_observer.blob_storage.save_blob(
+                    user_id, source, activity,
+                    event['blob']
+                )
+            elif event['event'] == 'fetch_blob':
+                blob = await learning_observer.blob_storage.fetch_blob(user_id, source, activity)
+                await ws.send_json({
+                    'status': 'fetch_blob',
+                    'data': blob
+                })
+        else:
+            yield event
+
+
 async def check_for_reducer_update(events):
     '''Check to see if the reducers updated
     '''
```

Contributor comment (on the `elif event['event'] == 'fetch_blob':` line):

> @bradley-erickson I am wondering how this will play with client-side code. Ideally, this would be sent back before we're done with metadata / init / headers / auth (before the events start streaming). We should chat in the morning about whether the client should request this when opening the websocket or whether this should be automatically sent.
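The control flow above follows a common async-generator filter shape: consume matching events in place, yield everything else downstream. A stripped-down sketch of that shape, with hypothetical event names and no KVS or websocket:

```python
import asyncio


async def incoming():
    # Hypothetical event stream
    for e in [{'event': 'save_blob'}, {'event': 'keystroke'}, {'event': 'fetch_blob'}]:
        yield e


async def consume_blob_events(events):
    async for event in events:
        if event['event'] in ['save_blob', 'fetch_blob']:
            # Handled here (e.g. saved/fetched against storage);
            # not yielded, so downstream stages never see blob events
            continue
        yield event


async def main():
    passed = [e async for e in consume_blob_events(incoming())]
    print(passed)  # [{'event': 'keystroke'}]


asyncio.run(main())
```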
```diff
@@ -470,6 +500,7 @@ async def process_ws_message_through_pipeline():
     events = decode_lock_fields(events)
     events = handle_auth_events(events)
     events = filter_blacklist_events(events)
+    events = process_blob_storage_events(events)
     events = check_for_reducer_update(events)
     events = pass_through_reducers(events)
     # empty loop to start the generator pipeline
```
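Each stage wraps the previous one, so the pipeline is a chain of async generators that only runs when something iterates it (hence the "empty loop to start the generator pipeline" comment). A toy version of the same wiring pattern, with made-up stages:

```python
import asyncio


async def produce():
    for n in range(5):
        yield n


async def keep_even(events):
    async for n in events:
        if n % 2 == 0:
            yield n


async def double(events):
    async for n in events:
        yield n * 2


async def main():
    # Same wiring style as process_ws_message_through_pipeline:
    # each stage takes the previous stage's generator
    events = produce()
    events = keep_even(events)
    events = double(events)
    # Iterating drives the whole chain (collected here to show output)
    out = [n async for n in events]
    print(out)  # [0, 4, 8]


asyncio.run(main())
```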
Reviewer comment (on `fetch_blob`):

> What happens if the key is missing from the kvs? Should probably return that there was nothing (empty object or `None`).

Reply:

> That's fine for now. `null` and not `None`. In the long term, we should support having a default value. In the long term, I would envision this being part of a module rather than the core. But we've got a long way before we're ready for that.
>
> (Also, note that's different from `json.dumps("null")`, which is `'"null"'`.)