Pyswip and thread #3
Hi, I'm also facing the same issue while running a Prolog query from my code (python analytics.py, where Thread-1 waits to read data from a Redis DB). Debugging with gdb (gdb -ex r --args python analytics.py, GNU gdb (Ubuntu 7.7.1-0ubuntu5~14.04.2) 7.7.1) gives: Program received signal SIGSEGV, Segmentation fault. Regards,
Did anyone make any progress on this? I am having the exact same issue when using PySwip in combination with Flask.
I've devised an ad-hoc non-intrusive solution, just to test things out:
IMO when using Prolog's foreign language interface it is usually better to think of the PL as a service that runs on a worker thread, to which you send jobs from other threads and later ask whether each job has completed. This is particularly well-suited to having the PL execute a batch of tasks that you can assign for execution some time before you need to collect the results. This is a little system I built a couple of days ago.
With my_pl_file.pl containing:
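(The actual contents of my_pl_file.pl were not preserved in this thread; the fragment below is an invented stand-in, just a few facts of the kind such a file might hold:)

```prolog
% Hypothetical knowledge base; the real my_pl_file.pl is not shown here.
father(michael, cathy).
father(michael, david).
mother(sarah, cathy).
mother(sarah, david).
```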
In this example the main thread sends a job (or a series of jobs) to the PlManager object, which has started a worker loop on another thread. Each job submitted returns a ticket number, and in this instance the main thread has nothing better to do than wait for the job to be completed. But it might be able to submit the jobs earlier in the frame and have basically zero latency when collecting results from SWI-Prolog. Output of running this program:
(Note that only the results of the last of the 32 jobs submitted are printed out; ticketID ends up being assigned to the value returned from submitting the final job.)

There are some downsides to this specific approach. In particular, I think (?) PySwip returns its query results as a generator object which handles the interactive nondeterminism of Prolog, generating solutions on the fly as necessary. In my approach the worker thread simply evaluates all solutions pre-emptively and stores them. This could be problematic if the state space were large (or infinite), for reasons of both processing time and storage space.

Having two separate thread locks might seem over-engineered, but as far as I can tell there's no need to block a thread posting a new job just because the manager is in the middle of updating the "completed" dict.

Oh, and originally I used polymorphic PlJob objects that had subtypes for general queries and file consults, but realized I could just use the general query method for the consult. Still, the ability to have polymorphic job classes might prove useful.
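The PlManager code itself was not preserved in this thread. Below is a minimal stdlib sketch of the pattern described above; the names (PlManager, submit, result) and the stubbed _run_query are illustrative, not PySwip API. In a real version, the worker thread would own the pyswip Prolog() object and replace the stub with an actual query call:

```python
import queue
import threading
import time

class PlManager:
    """Owns a single worker thread; all Prolog calls happen on that thread."""

    def __init__(self):
        self._jobs = queue.Queue()
        self._completed = {}                     # ticket -> list of solutions
        self._completed_lock = threading.Lock()  # guards _completed only
        self._next_ticket = 0
        self._ticket_lock = threading.Lock()     # guards ticket allocation
        threading.Thread(target=self._worker_loop, daemon=True).start()

    def submit(self, query):
        """Post a query job; returns a ticket for collecting results later."""
        with self._ticket_lock:
            ticket = self._next_ticket
            self._next_ticket += 1
        self._jobs.put((ticket, query))
        return ticket

    def result(self, ticket):
        """Return the job's solutions, or None if it has not completed yet."""
        with self._completed_lock:
            return self._completed.get(ticket)

    def _worker_loop(self):
        # In the real system this thread would create the pyswip Prolog()
        # object and run every query, so SWI-Prolog's foreign-language
        # interface is only ever touched from this one thread.
        while True:
            ticket, query = self._jobs.get()
            solutions = self._run_query(query)
            with self._completed_lock:
                self._completed[ticket] = solutions

    def _run_query(self, query):
        # Stand-in for list(Prolog().query(query)); echoes the query so the
        # sketch runs without pyswip installed.
        return ["solution for %r" % query]

manager = PlManager()
ticket = manager.submit("mortal(X)")
while manager.result(ticket) is None:
    time.sleep(0.01)   # the main thread could do useful work here instead
print(manager.result(ticket))
```

Using two locks mirrors the design choice discussed above: a thread posting a new job never blocks on the lock that guards the "completed" dict.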
Is there any progress on this? Sadly the solution from @xpinguin didn't work for me. |
Could you please quote the error you encountered? While I've departed from the Python "scene" towards C++ (the "mother tongue" for me, of sorts), I still have a soft spot for Prolog, so I'll do my best to solve the problem you've been facing.
Thank you, but I have moved the whole project to Python now. As I recall, there was not really an error; the Prolog file was simply not being used, which resulted in nothing being returned.
Is this multithreading issue in PySwip still present, or has it been solved? @yuce
Faced a similar result with Flask.
Hello,
I think there is a segmentation fault when pyswip is used in a thread. Here is a simple program to reproduce the error:
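(The original snippet was not preserved in this thread, but from the description it has roughly the following shape. The consulted file and predicate in the comment are invented, and a stub stands in for pyswip so the sketch is self-contained; with pyswip installed, the query call inside the thread is what triggers the SIGSEGV:)

```python
import threading

def run_query():
    # With pyswip this body would be something like:
    #     from pyswip import Prolog
    #     prolog = Prolog()
    #     prolog.consult("some_file.pl")            # hypothetical file
    #     print(list(prolog.query("mortal(X)")))    # hypothetical predicate
    # and the query call is where the segmentation fault occurs, because the
    # Prolog engine is being used from a thread it was not created on.
    return "stub: pyswip call elided"

t = threading.Thread(target=run_query)
t.start()
t.join()
```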
When I debug the program, the segmentation fault error occurs at this line (prolog.py, line 91):
swipl_fid = PL_open_foreign_frame()
Regards,
Bartholomew.