
Opening, reading, and closing the same file across multiple requests: do I need to do anything special to optimize this, or can I let the OS be responsible for caching the file in RAM?

I’m building a Plex clone that will use MPEG-DASH for adaptive bitrate streaming. There are basically two ways to do this. One is to break the media file into many discrete segment files that get returned as the client requests them; no special logic is required on the backend. The other is to keep a single, special “fragmented” media file, which is essentially the same discrete segments concatenated together. That’s how I want to do it. When the client requests the next chunk, the server (my PHP code) must open the file, seek to the appropriate byte range, read the chunk, then close the file. This would happen every 5-10 seconds during the stream.

Do I need to worry about this, or can I expect the OS to see that this is a frequently accessed file and cache the whole thing in memory so I’m not hitting the filesystem every time? Mass scalability is not a concern, since the application would only ever be designed for individual use with likely no more than half a dozen simultaneous streams, but I would still like to follow best practices.
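For context, a minimal sketch of the kind of chunk handler described above, assuming the byte offset and length of the requested segment are already known (e.g. passed as query parameters or looked up from a manifest). The path and parameter names here are placeholders, not part of the original post:

    <?php
    // Hypothetical fragmented media file and segment coordinates.
    $mediaPath = '/srv/media/movie.fmp4';
    $offset = (int) ($_GET['offset'] ?? 0);   // start byte of the segment
    $length = (int) ($_GET['length'] ?? 0);   // segment size in bytes

    $fh = fopen($mediaPath, 'rb');
    if ($fh === false || $length <= 0) {
        http_response_code(404);
        exit;
    }

    header('Content-Type: video/mp4');
    header('Content-Length: ' . $length);

    fseek($fh, $offset);          // jump to the segment's byte range
    echo fread($fh, $length);     // read and emit the chunk
    fclose($fh);                  // file is reopened on the next request

Each request repeats the open/seek/read/close cycle; whether the underlying reads hit disk or RAM is exactly the page-cache question being asked.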

submitted by /u/-Clem