What is the meaning of Naturism?

The belief in or practice of going nude in social settings, often in mixed-gender groups, either in cultures where this is not the norm or for health reasons.

The worship of the powers of nature.


Source: wiktionary.org