This seems to me to be a solution (and one that will have plenty of problems of its own) that does not address an actual problem. If you have large amounts of data that you need to store and search quickly and efficiently, there are plenty of better solutions out there for that.
If you're doing this as a learning exercise then best of luck with it, but it seems to me to be something of a waste of both time and effort.
-Stuart
On Sun, 14 Apr 2019 at 07:37, Rene Veerman <rene.veerman.netherlands@xxxxxxxxx> wrote:
Well, I intend to store and make searchable and aggregatable large quantities of JSON, which currently always needs a json_decode() into RAM from the filesystem. And I'd really like to keep the daemon that holds such decoded data in RAM written in PHP as well.

Can you provide some example code for a PHP daemon, started from init.d/, which listens for curl requests on a custom port?

On Sat, Apr 13, 2019, 18:17 Stuart Dallas <stuart@xxxxxxxx> wrote:

I'm confused about exactly what you're proposing. What exactly is it?

A "curl-able daemon" would be anything that talks the HTTP protocol. This is pretty trivial to implement in PHP, but I'd recommend against it. There are reasons why we tend to use a separate HTTP server when deploying web services written in PHP.

Would you care to elaborate on the details of your idea?

-Stuart

On Sat, 13 Apr 2019 at 15:27, Rene Veerman <rene.veerman.netherlands@xxxxxxxxx> wrote:

I need to avoid the overhead of gazillions of JSON and BSON decode operations, so I need something of a curl-able daemon running on the same machine that a standard Apache + PHP website is run from, _in PHP_.

A simple solution that can run from Ubuntu's init.d with a call to command-line PHP will do nicely.

And I fully intend to open source this one to all :)

It'll be called FolderDB.

Thanks for your time.
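For reference, the "curl-able daemon" discussed above can indeed be sketched in a few dozen lines of plain CLI PHP using stream_socket_server(). The sketch below is an illustration only, not the FolderDB implementation: the port number (8090), the sample data, the `--serve` flag, and the function names are all assumptions not taken from the thread. It keeps the decoded JSON in process memory, which is the point — the per-request json_decode() cost is paid once at startup instead of on every request.

```php
<?php
// folderdb_daemon.php — minimal single-threaded HTTP daemon sketch.
// Start from the command line (e.g. from an init.d script):
//   php folderdb_daemon.php --serve

// Decoded once, held in RAM for the lifetime of the process.
// (Hypothetical sample data; a real daemon would json_decode() a file here.)
$GLOBALS['store'] = ['users' => ['alice', 'bob'], 'count' => 2];

// Turn a raw HTTP request into a complete HTTP response string.
// The request path (minus the leading slash) is used as the lookup key.
function handle_request(string $request, array $store): string
{
    preg_match('#^GET /([^ ?]*)#', $request, $m);
    $key  = $m[1] ?? '';
    $body = json_encode($store[$key] ?? null);
    return "HTTP/1.1 200 OK\r\n"
         . "Content-Type: application/json\r\n"
         . "Content-Length: " . strlen($body) . "\r\n"
         . "Connection: close\r\n\r\n"
         . $body;
}

// Blocking accept loop: one request per connection, no concurrency —
// fine for localhost-only use, not for anything internet-facing.
function run_server(int $port, array $store): void
{
    $server = stream_socket_server("tcp://127.0.0.1:$port", $errno, $errstr);
    if ($server === false) {
        fwrite(STDERR, "bind failed: $errstr ($errno)\n");
        exit(1);
    }
    while ($conn = stream_socket_accept($server, -1)) {
        $request = fread($conn, 8192);
        fwrite($conn, handle_request($request, $store));
        fclose($conn);
    }
}

if (in_array('--serve', $argv ?? [], true)) {
    run_server(8090, $GLOBALS['store']);
}
```

With the daemon running, a lookup would be as simple as `curl http://127.0.0.1:8090/users`. Note this single-process loop serializes all requests; Stuart's caution stands — a production setup would normally sit PHP behind a real HTTP server rather than speak HTTP itself.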