EG
Permit System
During my time at EG, I collected and analysed user metrics to inform the development of the Permit System. Working within the team's Agile sprint workflow, I ensured research insights directly supported each phase of development. Findings were delivered through presentations and shared with the development team for immediate implementation.
Concept
What I Worked With
While working in Aalborg, I assisted EG with the development of a Permit System, a digital platform that streamlines the process of obtaining environmental permits, such as mining permits, for businesses and local governments. The system replaces outdated paper-based processes with an intuitive online solution, making applications faster, improving transparency, and enabling smoother collaboration between applicants and regulators.
The guided application portal walks users through each step, reducing errors and rejections with clear instructions and instant feedback. A central dashboard provides real-time updates on permit status, while built-in collaboration tools let caseworkers and applicants communicate directly, avoiding unnecessary delays. To help users navigate regulations confidently, the system also includes an interactive learning toolkit with FAQs, policy examples, and training modules.
Designed around real user feedback, which I helped gather, the EG Permit System prioritises clarity, accessibility, and sustainability. The result is a more efficient process that benefits businesses, local authorities, and the environment by cutting red tape and speeding up approvals for eco-friendly projects.
Development
The Team
To ensure alignment between user research and development, our team adopted an Agile workflow with structured touchpoints for feedback and iteration. Here’s how we worked:
Sprint Planning & Research Integration
At the start of each sprint, we defined research goals that directly informed the upcoming development phase—whether exploring pain points in permit applications or testing prototype improvements. Research tasks were treated as sprint backlog items, ensuring findings were actionable and timely.
Daily Standups for Cross-Disciplinary Alignment
During 15-minute daily scrums, I shared updates on ongoing research (e.g., interview progress, usability test recruitment) while developers and designers flagged questions needing user insights. This kept the team synchronised and allowed quick pivots, like adjusting a prototype based on a recurring user complaint.
Weekly Research Check-Ins
Every Thursday, I led a dedicated 60-minute session to:
- Present raw findings (e.g., interview quotes, usability videos) to the full team.
- Co-interpret data with developers and designers to prioritise fixes (e.g., “Users missed the document upload button, so let's increase its visual weight”).
- Plan validation tests for the next sprint.
These meetings turned research into shared ownership, not just a “report.”
Stakeholder Showcases
At sprint reviews, I distilled findings into bite-sized insights (e.g., “3/5 small businesses abandoned the form at Section 2”) paired with recommendations (e.g., “Break Section 2 into substeps with progress indicators”). This kept stakeholders engaged and invested in user-centered decisions.
Presenting Findings
I made sure my findings were clear and practical. Instead of sugarcoating things, I showed raw quotes and video clips of users struggling with the system. When explaining pain points, I kept it simple; for example, “4 out of 5 applicants got stuck on this step.” I connected these frustrations directly to design flaws so the team could see exactly what needed fixing.
To spot trends, I organised patterns on whiteboards with sticky notes. Every key insight led to a straightforward recommendation. I wrapped up each session with clear next steps, so no one left wondering what to do. My goal was simple: make the users’ problems impossible to ignore. No fluff, no vague feedback—just real issues that needed solving.