Why don't black people leave America and return to their own country?

America is Black people's country too. Why would they leave it?

You see, white Americans' ancestors came from Europe. They immigrated to the United States, and after arriving they slaughtered large numbers of the indigenous people, the Native Americans, who already lived there.

These white settlers then came to regard America as their own country.

How did black people come to America?

Because the United States needed a great deal of labor to build the country, slaves were brought over from Africa.

In this way, over time, they all became Americans.

Especially after the American Civil War, when the country was reunified, Black people nominally became full members of the American nation.

It can be said that whites and Blacks arrived in what is now the United States at essentially the same time.

Both hold American citizenship and both are rightful members of the country, so there is no reason for either group to leave. Yet racial discrimination has always existed in the United States.

White people have long looked down on Black people; after all, Black people's ancestors first arrived as slaves.

However, that was hundreds of years ago.

It can be said that this is a historical problem left over from the long development of the United States.

In fact, the true indigenous people of the United States are the Native Americans. If anyone were obliged to leave, it would have to be both the white and the Black people living there now.

If the white people are not leaving, there is no reason for the Black people to leave either.

Of course, the United States is now a multi-ethnic country, with Black, white, Asian, and Latino people, among others, and as long as they hold American citizenship, they are all Americans.

So they are already in their own country, and of course there is no need to leave. This is also the main reason why Black Americans fight so hard at home, demonstrating and at times even smashing and looting, rather than emigrating.

The United States is a nation of immigrants; both Black and white people came there in various ways. Black people's contribution to building America was in fact equal to that of whites. It is not the case that, because they came from Africa and arrived without education, the earliest Black arrivals' contributions should be buried and forgotten; this is one of the important reasons why Black people have gradually won recognition in American history.

Black people have lived in America for hundreds of years, and America is their country. Through two industrial revolutions, the United States gradually grew strong, accumulated great wealth, and became their long-term home. So why go back to a place that their ancestors never built with their own blood and sweat? To what country? To go hungry? To wait to be sold into slavery again?

After the opening of the new sea routes, oppressed Christians in Europe emigrated to America to survive, while Black people first arrived through the slave trade: they were sold in large numbers to the Americas, and whites sent them to this undeveloped land as cheap labor without human rights. Although the status of Black people has gradually improved over the course of American history, racial discrimination could not be completely eliminated in the less than 300 years since the founding of the United States. Racial discrimination, and white domination of other races, runs through the whole of American history, yet it is precisely these phenomena that have driven the advancement of Black rights. Black people are an important part of America, and an America without Black people would not be America.