- Using randomly generated subdomains and/or folder structures.
- Embedding resources in Base64.
- Using monolithic scripts to build the page.
It's possible that I'm the point of failure here. I really only know enough web development to keep my adblock list up to date (My development days were circa HTML 2-3). If sites are willing to go through the trouble (and server cycles) to generate this stuff on every pageload, are we well and truly stuck?
EDIT: One more behavior I've noticed: there are elements that normally load instantly. I can block these elements, and they're gone when I reload, but they don't STAY gone. If I watch the page load (in the element inspector in Chrome/Firefox), scripts seem to be re-adding the removed elements after the page has finished loading. It seems like once the filter has removed an element, it doesn't remove it again if a script re-adds it.
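To illustrate what I think is happening (a hedged toy model, not real DOM code — the names `oneShotFilter` and `siteScriptReAdds` are made up for this sketch): a one-shot cosmetic filter removes the element once at load time, but nothing watches for the site's script putting it back afterwards.

```javascript
// Toy model of the page: a list of element ids stands in for the DOM.
// (Hypothetical simplification; a real page would use actual DOM nodes.)
const page = ['header', 'content', 'ad-banner'];

// A one-shot filter, like a cosmetic rule applied once at page load:
// it removes the matching element and never looks at the page again.
function oneShotFilter(p, selector) {
  const i = p.indexOf(selector);
  if (i !== -1) p.splice(i, 1);
}

// The site's script, running after load, re-inserts the element
// whenever it notices the element is missing.
function siteScriptReAdds(p, selector) {
  if (!p.includes(selector)) p.push(selector);
}

oneShotFilter(page, 'ad-banner');   // filter fires once: ad is gone
siteScriptReAdds(page, 'ad-banner'); // site re-adds it: ad is back,
                                     // and the one-shot filter never re-runs
```

My understanding is that blockers counter this by watching for DOM changes (e.g. via a `MutationObserver`) and re-applying the filter on every mutation, rather than removing the element a single time — which may be why some filter syntaxes behave differently from simple element hiding here.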