
Why are American films so good? I mean, compared to German films, for example?

Hello,

I have noticed that in almost all American series, everything American is portrayed as good. The US government and the police are whitewashed and even portrayed as saviors; America is the only place that matters, and all other countries are unimportant. I have also noticed that the enemies quite often come from other countries. For example, there are often Russians, Italians, or Chinese characters who are portrayed as evil.

American life is also portrayed as perfect or ideal, and quite often only American celebrities are mentioned, and so on. At least that's what I've noticed.

So could it be said that a lot of these films are US propaganda?

I don't want to attack America here, but it seems to me that America is always portrayed as better and flawless in these films. Problems such as racism, the death penalty, gun laws, corrupt police officers, etc. are rarely shown, and when they are, they are played down as if they were normal or not that bad...

A few examples where I noticed something like this:

Dexter

New Girl

Grown Ups 2

Supernatural

How I Met Your Mother

But there are also series, such as Prison Break, in which these issues are addressed well. I would like to emphasize again that I have nothing against Americans and don't want to offend anyone, so I ask you to give factual answers.

What do you think? Thank you for your answers! :)