Facebook's data scientists are pushing back against the notion that the social network is an echo chamber that reinforces people's political and social views by serving up only content that matches users' existing beliefs. The criticism stems from the fact that Facebook uses algorithms to filter what you see based on what you have previously viewed or liked, isolating you from stories that challenge those beliefs. In a recent study, Facebook researchers found that although the stories users see reflect their ideological preferences, users are still exposed to differing points of view. Of course, critics point out that Facebook is constantly tweaking the News Feed algorithm and could easily make changes that would create a more tightly sealed echo chamber.
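To make the filtering mechanism concrete, here is a minimal, hypothetical sketch of engagement-based ranking in Python. It is not Facebook's actual News Feed code; the topics, post structure, and function name are invented purely for illustration of how ranking by past engagement can narrow what a user sees.

```python
# Hypothetical illustration only: a naive engagement-based ranker, not
# Facebook's actual News Feed algorithm. Topics and weights are invented.
from collections import Counter

def rank_feed(candidate_posts, engagement_history):
    """Order posts so that topics the user has engaged with most come first.

    candidate_posts    -- list of dicts like {"id": 1, "topic": "politics-left"}
    engagement_history -- list of topics the user previously liked or viewed
    """
    topic_counts = Counter(engagement_history)
    # Score each post by how often the user engaged with its topic;
    # topics the user never engaged with score 0 and sink to the bottom.
    return sorted(candidate_posts,
                  key=lambda post: topic_counts[post["topic"]],
                  reverse=True)

history = ["politics-left", "politics-left", "sports", "politics-left"]
posts = [
    {"id": 1, "topic": "politics-right"},
    {"id": 2, "topic": "politics-left"},
    {"id": 3, "topic": "sports"},
]
print(rank_feed(posts, history))
# Posts matching the user's past engagement rise to the top, which is the
# filtering effect critics describe as an echo chamber.
```

Even this toy example shows why the concern arises: the more a user engages with one kind of content, the more prominently that content is ranked, and the further dissenting stories fall in the feed.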