And how will Imgflip know that the user is under 18?
I suggest adding optional ID verification when creating an account: you submit a photo of yourself alongside your ID, and a moderation team confirms whether you are 18+. If you don't want to do ID verification, you can continue with a 13+ account, but you will not have access to NSFW memes.
With this more restrictive NSFW filter in mind, Imgflip could loosen its Terms to allow content that is legal for 18+ users, such as consensual pornography. Conversely, Imgflip should not put most strong language (except slurs, which are hate speech) under the proposed NSFW filter, since it would be much stricter than the current one. However, if coarse language is used as a clear attack on another user, or slurs are used, the meme would be deleted with no exceptions.
Also, I think that 13+ users should not see NSFW memes on the homepage at all. Currently, they see a placeholder message saying the image is NSFW. I think the image should not appear in any form, so that kids who see it aren't tempted to think "I need to see this!".

NSFW memes should also appear less often on the homepage. This could be done by reducing their upvote power by 10x: an NSFW image with 10 upvotes would get the same exposure as an SFW meme with 1. I'm not sure whether Imgflip already does something like this or whether kids are just rightfully scared of turning the filter on, but one meme of mine was SFW while on the front page, was later marked NSFW, and got kicked off the page as a result. If Imgflip already does this, good.
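To make the 10x idea concrete, here's a minimal sketch of how the downweighting could work. This assumes a simple score-based homepage ranking; every name here (hotness_score, NSFW_WEIGHT) is hypothetical and not Imgflip's actual code:

```python
# Hypothetical sketch of the proposed 10x NSFW downweighting.
# None of these names come from Imgflip; they are illustrative only.

NSFW_WEIGHT = 0.1  # NSFW upvotes count at 1/10 the power of SFW ones

def hotness_score(upvotes: int, is_nsfw: bool) -> float:
    """Homepage ranking score: NSFW upvotes are downweighted 10x."""
    weight = NSFW_WEIGHT if is_nsfw else 1.0
    return upvotes * weight

# An NSFW meme with 10 upvotes gets the same exposure as an SFW meme with 1:
assert hotness_score(10, is_nsfw=True) == hotness_score(1, is_nsfw=False)
```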