I said the platform *could* be in trouble, depending. Depending on what? Depending on a lot: what the platform does, what it holds itself out as, what it actually is, how it acts, what its users said, what its mods did or didn't do, and what happened next.
One example: this website, ImgFlip, does employ content moderation. By your own definition, that puts it in the "content creator" bucket.
Further, people make the mistake of assuming the law is static: that new precedents can't be set, that FCC rules in effect one day can't be reversed the next. This is an enormously active area of law, and I would always err on the side of caution.
Not to mention: law is not the only proper mode of analysis when dealing with these issues.
In a moral sense, online platforms certainly bear responsibility when they fail to moderate content they ought to.
In a business sense, a mass shooting or two, accompanied by a racist manifesto that can be directly linked to a platform, is not great for branding.