About the job: Intermediate BI Data Analyst
Generate insight through our unique informal-market data asset while establishing and implementing the analytical models needed to enrich and automate our insights, improving our responsiveness to and understanding of the data. Build a deep-rooted understanding of the data to drive strategy that fundamentally overhauls our decision-making processes, grounding them in logic and insight. Align our own data with external data sets that enrich our understanding, so we can draw predictable correlations and market insights. Through this understanding, package the data into easy-to-understand reports and dashboards aligned to our clients' needs.
Salary: R40 000-R60 000
Location: Gauteng, South Africa (remote)
Key Performance Areas:
Objectives of this role
- Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes.
- Help streamline our data science and analytics workflows by improving data delivery and quality to internal and external stakeholders.
- Work closely with the data science and business intelligence teams to develop data models and pipelines for research, reporting, and machine learning.
- Be an advocate for best practices and continued learning, constantly challenging the status quo and striving for excellence.
Responsibilities
- Work closely with our development team, data analysts and BI analysts to help build and maintain data flows that support our reporting requirements.
- Use agile software development processes to make iterative improvements to our back-end systems, particularly our reporting database.
- Model front-end and back-end data sources to help draw a more comprehensive picture of user flows throughout the system and to enable powerful data analysis.
- Build data pipelines that clean, transform, and aggregate data from disparate sources and deliver quality usable data to data analysts and BI analysts for reporting.
- Develop models that can be used to make predictions and answer questions for the overall business.
Data Processing & Management
- Gathering/Extracting data from the database for analysis.
- Cleaning and preparation of data for analysis.
- Quality-controlling our data processes and tables.
- Identifying, analysing, and interpreting market data for clients.
- Producing accurate BI reports & dashboards, within agreed-upon timeframes, which are:
  - Valuable
  - Understandable to both technical and non-technical audiences
  - Insightful and support quality decision-making for internal and external stakeholders
- Create project tracking reporting for all stakeholders.
- Setting up processes and systems to make working with data more efficient.
- Exploring and interpreting data to identify trends and opportunities for business improvement.
- Identifying data shortcomings and alerts within campaigns to drive proactive responses.
- Provide recommendations on campaign performance improvements, based on client data and industry standards.
- Researching new ways to make use of data, to improve business performance.
- Managing the team to ensure that quality deliverables are produced timeously.
Stakeholder Engagement
- Responding timeously to data-related requests and queries and keeping track of all requests.
- Collaborating with key stakeholders on all aspects of report creation, incl. deadlines, deliverables, edits, recommendations.
- Scoping out the data models required to drive efficiency, with stakeholder buy-in.
- Attending meetings with internal & external stakeholders to ensure understanding of the project data and requirements.
- Presenting information and communicating findings generated from data to stakeholders, suited to a technical and non-technical audience.
Required skills and qualifications
- Bachelor's degree in computer science, information technology, engineering, or a related analytical discipline.
- Three or more years of experience with Python, SQL, and data visualization/exploration tools (Power BI, Tableau, etc.).
- Familiarity with common Python-based ETL tools such as PySpark or Apache Airflow.
- Familiarity with Kimball & Inmon data warehousing approaches.
- Familiarity with the AWS ecosystem, specifically Redshift, RDS & EC2.
- Familiarity with PostgreSQL preferred.
- Strong communication skills, especially for explaining technical concepts to non-technical business leaders.
- Ability to work on a dynamic, results-oriented team that has concurrent projects and priorities.