The Apache Software Foundation Announces Apache® Ranger™ as a Top-Level Project
Forest Hill, MD, Feb. 08, 2017 (GLOBE NEWSWIRE) — The Apache Software Foundation (ASF), the all-volunteer developers, stewards, and incubators of more than 350 Open Source projects and initiatives, announced today that Apache® Ranger™ has graduated from the Apache Incubator to become a Top-Level Project (TLP), signifying that the project’s community and products have been well-governed under the ASF’s meritocratic process and principles.

The latest addition to the ASF’s more than three dozen projects in Big Data, Apache Ranger is a centralized framework used to define, administer, and manage security policies consistently across Apache Hadoop components. Ranger also offers comprehensive security coverage, with native support for numerous Apache projects, including Atlas (incubating), HBase, HDFS, Hive, Kafka, Knox, NiFi, Solr, Storm, and YARN.

“Graduating to a Top-Level Project reflects the maturity and growth of the Ranger Community,” said Selvamohan Neethiraj, Vice President of Apache Ranger. “We are pleased to celebrate a great milestone and officially play an integral role in the Apache Big Data ecosystem.”

Apache Ranger provides a simple and effective way to set access control policies and audit data access across the entire Hadoop stack, following industry best practices. A key benefit of Ranger is that security administrators can manage access control policies from a single place, consistently across the Hadoop ecosystem. Ranger’s robust plugin architecture, which can be extended with minimal effort, also enables the community to add authorization support for new systems, even outside the Hadoop ecosystem.
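To make the centralized-administration model concrete, the sketch below builds an access control policy and shows how it could be submitted to the Ranger Admin server over its public REST API. This is a minimal illustration, not a definitive recipe: the host, credentials, and the `hivedev` service name are hypothetical placeholders, and the exact policy fields accepted depend on the Ranger version deployed.

```python
import json
import urllib.request

RANGER_ADMIN = "http://ranger-admin.example.com:6080"  # hypothetical host


def build_hive_policy(name, database, table, user, accesses):
    """Build a Ranger-style policy payload granting `accesses` on a Hive table."""
    return {
        "service": "hivedev",            # hypothetical Hive service name
        "name": name,
        "isEnabled": True,
        "isAuditEnabled": True,          # audit every access decision
        "resources": {
            "database": {"values": [database]},
            "table": {"values": [table]},
            "column": {"values": ["*"]},  # all columns of the table
        },
        "policyItems": [{
            "users": [user],
            "accesses": [{"type": a, "isAllowed": True} for a in accesses],
        }],
    }


def create_policy(policy, auth_header):
    """POST the policy to the Ranger Admin public v2 API (not executed here)."""
    req = urllib.request.Request(
        RANGER_ADMIN + "/service/public/v2/api/policy",
        data=json.dumps(policy).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": auth_header},
        method="POST",
    )
    return urllib.request.urlopen(req)


# Grant a single analyst read access to one table, managed from one place.
policy = build_hive_policy("sales-read", "sales", "orders", "analyst1",
                           ["select"])
print(json.dumps(policy, indent=2))
```

Because every component's plugin pulls its policies from the same Admin server, a payload like this takes effect across the stack without touching each service individually.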
In addition, Apache Ranger provides many advanced features, such as:

- Ranger Key Management Service (compatible with Hadoop’s native KMS API, to store and manage encryption keys for HDFS Transparent Data Encryption)
- Dynamic column masking and row filtering
- Dynamic policy conditions (such as prohibition of toxic joins)
- User context enrichers (such as geo-location and time-of-day mappings)
- Classification- or tag-based policies for Hadoop ecosystem components, via integration with Apache Atlas
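As an illustration of the dynamic column-masking feature above, the sketch below shows the shape of a data-mask policy payload that reveals only the last four characters of a sensitive column to one group. The field names follow Ranger’s data-mask policy model, but the service, database, table, and group names are hypothetical, and available mask types vary by Ranger version.

```python
import json

# Hypothetical data-mask policy: members of the "analysts" group may query
# the `ssn` column of sales.customers, but see only its last four characters.
mask_policy = {
    "service": "hivedev",                    # hypothetical Hive service name
    "name": "mask-ssn-for-analysts",
    "policyType": 1,                         # data-mask policy (vs. plain access)
    "resources": {
        "database": {"values": ["sales"]},
        "table": {"values": ["customers"]},
        "column": {"values": ["ssn"]},       # masking is applied per column
    },
    "dataMaskPolicyItems": [{
        "groups": ["analysts"],
        "accesses": [{"type": "select", "isAllowed": True}],
        # Built-in mask type that shows only the trailing four characters
        "dataMaskInfo": {"dataMaskType": "MASK_SHOW_LAST_4"},
    }],
}

print(json.dumps(mask_policy, indent=2))
```

Row filtering follows the same pattern, with a filter expression in place of the mask type, so both features are administered as ordinary policies rather than per-application code.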