
Parallel Multithreaded Processing with cURL on LAMP or similar V 15


I have written a PHP task/job scheduler... it fires off jobs, which are PHP scripts that perform various tasks.

Currently it's working pretty well, but every now and again I find that some of the jobs are "stuck" because another job is currently running and holding them up.

I attempted to prevent this by launching the jobs with the command below. This is how I test it from the terminal; in my code I fire it via PHP's exec().

bash -c "exec nohup setsid curl -s 'http://localhost/MYJOBHERE_1.PHP' > /dev/null 2>&1 &"
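In my code that command is fired roughly like this (MYJOBHERE_1.PHP is a placeholder for the real job script):

<?php
// Fire one job in the background so exec() returns immediately.
// nohup + setsid detach the curl call from the PHP process;
// all output is discarded to /dev/null.
$url = 'http://localhost/MYJOBHERE_1.PHP'; // placeholder job URL
$cmd = "bash -c \"exec nohup setsid curl -s '" . $url . "' > /dev/null 2>&1 &\"";
exec($cmd);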

When I have multiple jobs to run, I send the same thing, but join each bash -c command to the next with an ampersand (&) so it becomes one long command, like so:

bash -c "exec nohup setsid curl -s 'http://localhost/MYJOBHERE_1.PHP' > /dev/null 2>&1 &" & bash -c "exec nohup setsid curl -s 'http://localhost/MYJOBHERE_2.PHP' > /dev/null 2>&1 &" & bash -c "exec nohup setsid curl -s 'http://localhost/MYJOBHERE_3.PHP' > /dev/null 2>&1 &"
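In PHP I build that long string in a loop, roughly like this (the job URLs are placeholders):

<?php
// Build one long command that backgrounds every job, then fire it once.
$jobs = [
    'http://localhost/MYJOBHERE_1.PHP',
    'http://localhost/MYJOBHERE_2.PHP',
    'http://localhost/MYJOBHERE_3.PHP',
];
$parts = [];
foreach ($jobs as $url) {
    $parts[] = "bash -c \"exec nohup setsid curl -s '" . $url . "' > /dev/null 2>&1 &\"";
}
$cmd = implode(' & ', $parts); // '&' between parts so bash does not wait on any of them
exec($cmd);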

Now if I send a bunch of these from the terminal, calling scripts that write to a DB table so I can watch the timestamps, it seems like they all run, but at very different times. Is this the operating system deciding how to prioritize them, or what? Is there any way to make sure they all run at the same time without other processes slowing them down?

The biggest issue is that some tasks really are higher priority than others... is there a way for me to assign a priority to those tasks using the method I have?
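I know nice exists for shell commands; would prefixing the job be the right approach? A sketch of what I had in mind (nice -n 10 here would lower the priority of a less important job; raising priority with a negative value needs root as far as I know):

bash -c "exec nohup setsid nice -n 10 curl -s 'http://localhost/MYJOBHERE_2.PHP' > /dev/null 2>&1 &"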

Thanks for any and all input.
