While Filter Bubbles largely emerge from algorithmic curation, Echo Chambers arise from the social tendency of people to seek out and engage with viewpoints that confirm their pre-existing beliefs and biases. This occurs not just on digital platforms but also in physical spaces: people sort themselves into clubs, religious institutions, and friend groups according to their shared preferences.
The formation of Echo Chambers is driven by confirmation bias, the human inclination to favor information that aligns with one's existing views and to discount information that doesn't. Digital platforms can inadvertently enable this behavior by encouraging users to selectively follow, friend, or interact with people and sources that share their perspectives, and to avoid those that challenge them. Over time, users build a self-curated information space that largely echoes their own beliefs back to them, reinforcing their existing ideologies.
The internet doesn't fundamentally change the elements of human nature that create and ossify Echo Chambers. Online communities, like those in the physical world, tend to draw together like-minded people; by enabling that self-sorting, platforms are only extending a challenge of the physical world into a digital space, albeit one in which we spend increasing amounts of time.
One caveat: platforms enable the formation of communities whose members might otherwise be too rare, or too stigmatized, for proximity-based association to generate community. Robust queer communities were hallmarks of the early internet precisely because the network enabled connections at a distance for people who were unable to form analogous connections in person. The internet let queer folks overcome both their demographic infrequency and the pervasive taboos of the time to "find their people".
Since the phenomenon of Echo Chambers is ultimately a function of common psychology, and is clearly not unique to platforms, many platforms take no active steps to tackle this problem.
Though digital platforms don't play the major part in creating Echo Chambers, their design can make the problem dramatically better or worse. Interventions that intentionally break down the narrowness of groups can weaken Echo Chambers. For example, platforms could offer shared (global) home pages, introduce intentional diversity into friend recommendations, or inject authoritative information into content feeds. Any of these strategies, though they might seem like small steps, would reduce the pull of Echo Chambers.
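To make the recommendation-diversity intervention concrete, here is a minimal sketch of how a platform might reserve a fraction of each recommendation list for out-of-cluster candidates. This is a hypothetical illustration, not any real platform's algorithm; the function name, the in-group/out-group split, and the `diversity_rate` parameter are all assumptions made for the example.

```python
import random

def diversify_recommendations(in_group, out_group, k=10,
                              diversity_rate=0.2, seed=None):
    """Blend out-of-cluster items into a recommendation list.

    in_group: candidates similar to the user's existing network,
              assumed already ranked best-first (hypothetical input).
    out_group: candidates drawn from outside the user's usual cluster.
    diversity_rate: fraction of the final list reserved for
                    out-of-group items.
    """
    rng = random.Random(seed)
    # Always reserve at least one slot for a diverse pick.
    n_diverse = max(1, int(k * diversity_rate))
    n_similar = k - n_diverse
    picks = in_group[:n_similar]
    picks += rng.sample(out_group, min(n_diverse, len(out_group)))
    rng.shuffle(picks)  # avoid signalling which slots are "diversity" picks
    return picks
```

The key design choice is that diversity is enforced structurally (reserved slots) rather than left to the ranking score, so the echo-chamber pressure of pure similarity ranking can't squeeze the diverse items out.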