Imagine a simple route like this:
app.get('/test',
  apiCall1(),
  apiCall2(),
  apiCall3(),
  apiCall4(),
  apiCall5(),
  renderPage()
);
How would you prevent this code from being executed again by the same client while that client is spamming GET requests to the same route (by refreshing the page, clicking buttons, etc.)?
I'm using express (+session, static, bodyparser) and have failed to figure this out on my own.
For example, when I first visit /test in the browser, it works fine. But if I then quickly refresh the page twice in a row, I hit the route twice for the same session, which in turn doubles the number of API calls from 5 to 10.
I want it to only occur once, and render the page. I've tried session variables like "isRequesting true/false" and such, but nothing works, because the response object changes with every new request, meaning renderPage can't call res.send on the "latest" response object.
I'd add another middleware at the head of the chain that keeps track of accesses by a given client (session ID? IP address? Identifying a client can be its own problem, depending!) and errors (with next(an error here)) if the client is over-requesting. Defining "over-requesting" is up to you.
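To make that concrete, here's a minimal sketch of what such a gatekeeper middleware could look like, assuming express-session is in use so req.sessionID identifies the client (inFlight and gatekeeper are made-up names, and what counts as "over-requesting" is still your call):

var inFlight = {};

function gatekeeper(req, res, next) {
  var id = req.sessionID; // or req.ip, depending on how you identify clients
  if (inFlight[id]) {
    // This client already has a request to this route in progress.
    return next(new Error('Too many requests'));
  }
  inFlight[id] = true;
  // Clear the flag once this response finishes or the connection drops.
  res.on('finish', function () { delete inFlight[id]; });
  res.on('close', function () { delete inFlight[id]; });
  next();
}

app.get('/test', gatekeeper, apiCall1(), apiCall2(), apiCall3(), apiCall4(), apiCall5(), renderPage());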
Now that is an interesting case, and I think an ideal case for promises: on the first request from a client, create a promise and attach it AND its resolve function to some storage that will outlast the request. If one already exists, don't create another.
Call the promise's .then() with the renderPage code.
Call next() if you set up a new promise.
Let renderPage resolve that promise instead of rendering.
var promises = {};
var resolvers = {};

function startup(req, res, next) {
  var mainRequest = false;
  if (!promises[req.clientID]) {
    promises[req.clientID] = new Promise(function (resolve, reject) {
      resolvers[req.clientID] = { resolve: resolve, reject: reject };
      mainRequest = true;
    });
  }
  promises[req.clientID].then(function (data) {
    // This is where your rendering ACTUALLY goes.
    cleanup();
    res.render(data);
  }).catch(function (err) {
    cleanup();
    res.render('errors');
    // Note that you CAN'T call next(err) here since next has already been
    // called in some branches. C'est la vie. You can work around it, but it's ugly.
  });
  if (mainRequest) next();

  function cleanup() {
    delete promises[req.clientID];
    delete resolvers[req.clientID];
  }
}
And renderPage:

function renderPage(req, res, next) {
  resolvers[req.clientID].resolve(/* the data used to render */);
}

And error handling:

function errorHandler(err, req, res, next) {
  resolvers[req.clientID].reject(err);
}
(that could be in your final apiCall5, but I'd probably factor it separately.)
What you've now done is join these requests: they'll all complete when the promise is resolved.
But if a user hits Refresh (F5) twice instantly, for example, I only want the code to run once. Refreshing twice means two response objects, though, and I can only send a valid response to the second response object created (the first one won't work, according to my tests), so I can't just ignore the later requests either.
There's not a system on the planet that will make that work the way you want it to.
Once they hit refresh, they have broken the connection to the backend. It's not the same connection any more (well, it might be under certain very specific circumstances, but let's not get into that).
Now you *might* be able to detect when/if that connection breaks and stop processing further API calls (though detecting broken connections isn't always trivial), but you still have to run all those API calls for the latest request.
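If you do want to try the detection route, here's a rough sketch. The exact disconnect events vary a bit between Node versions, so treat this as a starting point; clientGone and watchForDisconnect are made-up names:

function watchForDisconnect(req, res, next) {
  req.clientGone = false;                 // made-up flag, just for illustration
  res.on('close', function () {
    // 'close' fires when the response is done OR the socket dropped early;
    // if we never finished writing, the client most likely went away.
    if (!res.writableEnded) req.clientGone = true;
  });
  next();
}

// Then each apiCall middleware can bail out early:
//   if (req.clientGone) return;          // nothing useful left to do here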
Your best (perhaps only) way around this is to use AJAX on the front end and have the front-end make those API calls individually to individual endpoints.
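Roughly, something like this, with made-up endpoint names and each endpoint sending its own response:

// Server side: each API call becomes its own endpoint.
app.get('/api/one', apiCall1());
app.get('/api/two', apiCall2());
// ...and so on, with each handler ending in something like res.json(result).

// Front end: render the page shell immediately, then fetch the data.
Promise.all([
  fetch('/api/one').then(function (r) { return r.json(); }),
  fetch('/api/two').then(function (r) { return r.json(); })
]).then(function (results) {
  // update the DOM with results[0], results[1], ...
});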
Why can't your GET request be called multiple times? If it changes data, then you should make it a POST/PUT/PATCH request.

Then with the POST/PUT/PATCH request, you could have an authenticity token that you verify is correct before processing any requests (which is sort of what Aria is suggesting). A rough sketch of that idea follows below.

If the route doesn't change or create data, and simply reads it, then you could set up caching for your API requests.

However, given your message on 13th November at 7:38 pm UTC, I'd be inclined to say that you're misusing GET requests if you're "creating" things.
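For illustration, a rough sketch of the token idea, assuming express-session and body-parser are in play (as the original post says); /form, authToken, and verifyToken are names I've made up:

var crypto = require('crypto');

// Hand a single-use token to the client when the page/form is first rendered.
app.get('/form', function (req, res) {
  req.session.authToken = crypto.randomBytes(16).toString('hex');
  res.render('form', { authToken: req.session.authToken });
});

// Verify (and burn) the token before doing any real work.
function verifyToken(req, res, next) {
  if (!req.body.authToken || req.body.authToken !== req.session.authToken) {
    return next(new Error('Invalid or reused token'));
  }
  delete req.session.authToken; // single use: a second submit won't match
  next();
}

app.post('/test', verifyToken, apiCall1(), apiCall2(), apiCall3(), apiCall4(), apiCall5(), renderPage());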
I fail to see where I said the GET requests are "creating"/"adding" anything, so I apologize if my explanation caused such confusion.
The GET requests are merely fetching data. Data which is very likely to change more than once an hour, which is why caching isn't an option.
I've solved the performance issues by pooling the database connections instead, and it's working fine now. Although there's still a lot of unnecessary resources being used on the database side when requests are spammed like this. That's why I want a decent way to prevent spammed GET requests from the same client, at the same time, for the same view.
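For reference, the pooling change is roughly this shape (shown here with the mysql2 driver purely as an example, with placeholder credentials; other drivers such as pg offer the same idea):

var mysql = require('mysql2');

var pool = mysql.createPool({
  host: 'localhost',
  user: 'app',
  database: 'mydb',     // placeholder credentials
  connectionLimit: 10   // reuse a capped set of connections instead of opening one per request
});

// Queries borrow a connection from the pool and hand it back automatically.
pool.query('SELECT * FROM things WHERE id = ?', [42], function (err, rows) {
  if (err) { /* handle the error */ }
  // use rows
});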
> The GET requests are merely fetching data. Data which is very likely to change more than once an hour, which is why caching isn't an option.
You can, of course, write your code to cache things for any length of time that makes sense for you. It doesn't have to be an hour. Caching for just 5 minutes or even 1 minute would not be noticeable to the user in many cases, and could help performance.
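For example, a minimal in-memory sketch that caches per URL for one minute (the names here are made up; modules like apicache, or an external cache such as Redis, would do this more robustly):

var cache = {};              // url -> { body, expires }
var TTL_MS = 60 * 1000;      // one minute

function shortCache(req, res, next) {
  var hit = cache[req.originalUrl];
  if (hit && hit.expires > Date.now()) {
    return res.send(hit.body);           // serve the cached copy, skip the API calls
  }
  var realSend = res.send.bind(res);
  res.send = function (body) {           // store the fresh result on the way out
    cache[req.originalUrl] = { body: body, expires: Date.now() + TTL_MS };
    return realSend(body);
  };
  next();
}

app.get('/test', shortCache, apiCall1(), apiCall2(), apiCall3(), apiCall4(), apiCall5(), renderPage());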