Persisting Problog Python objects #93
Perhaps of interest (source): you can extend a database, so you can create one "base" database and, for each batch of dynamically added facts, create an extension of that base database, throwing the extended database away when you no longer need it.
So, something more like:

```python
from problog.program import PrologString
from problog.engine import DefaultEngine
from problog.formula import LogicFormula, LogicDAG
from problog.sdd_formula import SDD

# -- prepare static db part --
static_problog_string = """
:- use_module(library(db)).
:- csv_load('static_facts.csv', 'static_facts'). % This is huge but static.
P::static_fact(X) :- static_facts(P, X).
P::dynamic_fact(Y) :- dynamic_facts(P, Y).
predicate_instance(X, Y) :- static_fact(X), dynamic_fact(Y), some_relation(X, Y).
query(predicate_instance(_, _)).
"""
static_db = DefaultEngine().prepare(PrologString(static_problog_string))

# while loop for dynamic part
while not stop:
    # -- complete program --
    extended_db = static_db.extend()
    dynamic_data = get_dynamic_data()
    # Instead of writing dynamic_data into a CSV and then loading the CSV
    # into the ProbLog db, consider writing straight into extended_db.
    # Not sure what the efficiency difference will be...
    write_dynamic_csv(dynamic_data, 'dynamic_facts.csv')
    extended_db += PrologString(":- csv_load('dynamic_facts.csv', 'dynamic_facts').")

    # -- evaluate --
    # If you do not need to time the separate steps, you can do them all at once
    # using: problog_output = SDD.create_from(extended_db).evaluate()
    logic_formula = LogicFormula.create_from(extended_db)        # ground program
    directed_acyclic_graph = LogicDAG.create_from(logic_formula) # break cycles
    sentential_decision_diagram = SDD.create_from(directed_acyclic_graph)  # compile to SDD
    problog_output = sentential_decision_diagram.evaluate()
    stop = determine_to_stop(problog_output)
```

Something that may help is passing `sdd_auto_gc=True` to `SDD.create_from`; sometimes it becomes slower, though.
Suppose I have

```python
write_static_csv(static_data, 'static_facts.csv')
while not stop:
    ...
```

Is there a way to persist the ProbLog engine, keeping the static facts between iterations, but subtracting the old dynamic data and adding the new without recreating the `LogicFormula`, `LogicDAG`, and `SDD` objects?
Also, what are some possible ways to speed up the ProbLog execution steps? Currently, `sentential_decision_diagram.evaluate()` takes 20+ minutes for me. Thanks.