- Using randomly generated subdomains and/or folder structures.
- Embedding assets (images, scripts) directly in the page as Base64.
- Using monolithic scripts to build the page.
Example: A page might be blank except for a call to a single JS file, which it uses to build the page and add the elements it needs. The JS seems to be generated on demand, with different randomized values on every pageload. All images (including the ones you want) are encoded in Base64 directly in the elements.
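For anyone who hasn't seen this pattern, here's a minimal sketch of what such a generated loader script might look like. Everything here is hypothetical (the names, the payload, the random-id scheme); the point is that there are no image URLs for a network filter to match, and no stable ids or classes for a cosmetic filter to target.

```javascript
// Hypothetical sketch of an obfuscated, server-generated loader script:
// the HTML is blank, and this JS builds the page at runtime.

// Wrap a Base64 payload in a data: URI. Because there is no network
// request for the image, URL-based blocking rules never see it.
function toDataUri(mime, base64Payload) {
  return "data:" + mime + ";base64," + base64Payload;
}

// Randomized ids/classes on every pageload defeat static cosmetic
// filters like example.com##.ad-banner.
function randomId() {
  return "x" + Math.random().toString(36).slice(2, 10);
}

// Browser-only part: assemble the page element by element.
if (typeof document !== "undefined") {
  const img = document.createElement("img");
  img.id = randomId();                                  // different every load
  img.src = toDataUri("image/gif", btoa("GIF89a..."));  // payload inlined
  document.body.appendChild(img);
}
```

Wanted content and ads come through the exact same pipeline, which is what makes this so hard to filter selectively.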
It's possible that I'm the point of failure here. I really only know enough web development to keep my adblock list up to date (My development days were circa HTML 2-3). If sites are willing to go through the trouble (and server cycles) to generate this stuff on every pageload, are we well and truly stuck?
EDIT: One more behavior I've noticed. Some elements normally load instantly. I can block these elements, and they are gone when I reload, but they don't STAY gone. If I watch the page load (in the element inspector in Chrome/Firefox), scripts seem to be re-adding the removed elements after the page has fully loaded. It seems like once an element has been removed, if a script re-adds it, the filter does not remove it again.
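That re-adding behavior is exactly what DOM mutation watching is meant to counter. As a rough sketch (the selectors are hypothetical, and real blockers have their own internal machinery for this), a user script can register a MutationObserver so the removal is re-applied every time a page script re-inserts the element, instead of only once at load:

```javascript
// Hedged sketch: keep blocked elements gone even when page scripts
// re-insert them, by watching the DOM and re-applying the removal.

// Pure predicate: does a newly added element match a blocked selector?
function isBlocked(node, selectors) {
  return typeof node.matches === "function" &&
         selectors.some((sel) => node.matches(sel));
}

// Browser-only part: re-remove on every insertion.
if (typeof MutationObserver !== "undefined") {
  const blockedSelectors = [".sponsored", "#promo-box"]; // hypothetical
  const observer = new MutationObserver((mutations) => {
    for (const m of mutations) {
      for (const node of m.addedNodes) {
        if (node.nodeType === 1 && isBlocked(node, blockedSelectors)) {
          node.remove(); // gone again, no matter how often it comes back
        }
      }
    }
  });
  observer.observe(document.documentElement, {
    childList: true,
    subtree: true,
  });
}
```

So if your filter only strips elements once at load, that would explain what you're seeing; the fix is continuous observation rather than a one-shot sweep.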