This document provides my comprehensive solutions and explanations for all tasks in the Aon Developer Pre-Interview Questionnaire.
Acting as the compiler, I can say that the given code from the interview questionnaire will not compile. The problem is that the decrement() method in the MyIncDec class is incomplete: since the class implements the IncDec interface, it must provide full implementations of both increment() and decrement(). Because the method body is missing and the braces are unbalanced, the compiler would report errors such as '}' expected or reached end of file while parsing (the method is not closed properly), and MyIncDec is not abstract and does not override abstract method decrement() in IncDec (the method is not implemented).
Logically, however, the missing part is easy to identify. To make the solution correct, I would complete the decrement() method with a proper implementation and create the necessary Java files, as sketched below.
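A minimal sketch of those files follows; the exact interface shape, the counter field, and the getter are assumptions for illustration, and the key point is that decrement() now has a body and the braces are balanced.

```java
// IncDec.java - assumed shape of the interface described in the questionnaire
public interface IncDec {
    void increment();
    void decrement();
}
```

```java
// MyIncDec.java - hypothetical completed implementation (the counter field is an assumption)
public class MyIncDec implements IncDec {

    private int counter = 0;

    @Override
    public void increment() {
        counter++;
    }

    @Override
    public void decrement() {
        counter--;   // the previously missing body, now properly closed
    }

    public int getCounter() {
        return counter;   // convenience accessor for tests (also an assumption)
    }
}
```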
Modify the existing IncDec interface implementation to measure and log the execution time (in milliseconds) for each call/invocation to the increment() and decrement() methods. Your solution should not require any code change in current classes that implement IncDec interface.
The key constraint is that we cannot modify existing IncDec implementations. This is a classic use case for the Proxy Design Pattern; a minimal sketch follows the design notes below.
1. High Precision Timing:
Rationale: System.nanoTime() provides the highest precision available in Java and is more accurate than System.currentTimeMillis() for measuring short execution times; the nanosecond reading is then converted to milliseconds when logged, as the task requires.
2. Exception Safety with try-finally:
Rationale: Ensures timing measurements are always logged, even if the wrapped method throws an exception.
3. Instance Identification:
Rationale: Distinguishes between different proxy instances in logs, essential for debugging in multi-instance scenarios.
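Bringing the three decisions above together, here is a minimal sketch of the timing proxy, assuming the IncDec interface from the previous task; the class name, log format, and instance-counter mechanics are illustrative assumptions.

```java
// Wraps any existing IncDec implementation and logs how long each call takes.
public class TimingIncDecProxy implements IncDec {

    private static int instanceCounter = 0;    // simple (not thread-safe) id source for the sketch

    private final IncDec target;               // existing implementation, left unchanged
    private final int instanceId;              // distinguishes proxy instances in the logs

    public TimingIncDecProxy(IncDec target) {
        this.target = target;
        this.instanceId = ++instanceCounter;
    }

    @Override
    public void increment() {
        timeAndDelegate("increment", target::increment);
    }

    @Override
    public void decrement() {
        timeAndDelegate("decrement", target::decrement);
    }

    private void timeAndDelegate(String methodName, Runnable call) {
        long start = System.nanoTime();                     // high-precision start time
        try {
            call.run();                                     // delegate to the real implementation
        } finally {
            long elapsedMillis = (System.nanoTime() - start) / 1_000_000;
            // logged even if the wrapped method throws
            System.out.printf("[proxy-%d] %s took %d ms%n", instanceId, methodName, elapsedMillis);
        }
    }
}
```

Usage requires no change to the existing classes: IncDec timed = new TimingIncDecProxy(new MyIncDec()); timed.increment();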
Imagine you need to do something similar (timing of method calls) for many places in an application. What other ideas come to mind, e.g. could you propose other ideas for getting timing information more conveniently?
In my Spring Boot projects, I use AOP with annotations (e.g., @Timed) to capture method execution times. This keeps timing logic centralized and avoids scattering code across services, especially useful in large enterprise applications with many APIs.
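As a hedged illustration of that idea (assuming spring-boot-starter-aop is on the classpath), the sketch below defines an illustrative custom @Timed annotation and an aspect that times every annotated method; the package and class names are assumptions, and Micrometer also ships a ready-made @Timed with a matching TimedAspect.

```java
package com.example.monitoring;   // assumed package, referenced in the pointcut below

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;

// Marker annotation placed on any method whose execution time should be logged.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface Timed { }

// Central aspect: timing logic lives here instead of being scattered across services.
@Aspect
@Component
class TimingAspect {

    @Around("@annotation(com.example.monitoring.Timed)")
    public Object time(ProceedingJoinPoint joinPoint) throws Throwable {
        long start = System.nanoTime();
        try {
            return joinPoint.proceed();                  // run the annotated method
        } finally {
            long elapsedMillis = (System.nanoTime() - start) / 1_000_000;
            System.out.printf("%s took %d ms%n", joinPoint.getSignature(), elapsedMillis);
        }
    }
}
```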
While less common in my daily work, Java agents and tools like Javassist can help monitor legacy or third-party libraries without code changes. This is valuable in enterprise systems that I’ve integrated with where direct modification wasn’t possible.
On the backend, I’ve built custom annotations whose timing logic is woven in at compile time (for example via annotation processing or compile-time weaving), so measurement code is generated rather than hand-written. On the frontend, Angular supports TypeScript decorators, which I can extend to log performance in components or services. This keeps monitoring type-safe and IDE-friendly.
In Spring Boot, I’ve used interceptors and dynamic proxies to add cross-cutting concerns like logging and timing at runtime. In Angular, I use HTTP Interceptors to capture request/response times, giving clear visibility into API performance end-to-end.
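On the Java side, the dynamic-proxy route can be captured in a small reusable helper; the sketch below uses java.lang.reflect.Proxy, and the helper name is an assumption.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

// Generic timing wrapper for any interface-based object, reusable wherever timing is needed.
public final class TimingProxies {

    private TimingProxies() { }

    @SuppressWarnings("unchecked")
    public static <T> T withTiming(T target, Class<T> iface) {
        InvocationHandler handler = (proxy, method, args) -> {
            long start = System.nanoTime();
            try {
                return method.invoke(target, args);          // delegate to the real object
            } finally {
                long elapsedMillis = (System.nanoTime() - start) / 1_000_000;
                System.out.printf("%s took %d ms%n", method.getName(), elapsedMillis);
            }
        };
        return (T) Proxy.newProxyInstance(iface.getClassLoader(), new Class<?>[]{iface}, handler);
    }
}
```

For example, reusing the earlier types: IncDec timed = TimingProxies.withTiming(new MyIncDec(), IncDec.class);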
For backend analysis, I’ve worked with JVM profilers and monitoring tools to track memory and execution bottlenecks. On the frontend, I rely on Angular DevTools and Chrome Lighthouse for runtime profiling of UI rendering and API call performance.
I’ve used Micrometer in Spring Boot microservices to collect metrics and expose them to Prometheus and Grafana dashboards. This aligns with my experience in CI/CD pipelines and cloud deployments, ensuring that both backend and frontend services are observable in production.
Implement a method that given two arrays as parameters will find the starting index where the second parameter occurs as a sub-array in the array given as the first parameter. Then implement unit tests for this method.
Example: [4,9,3,7,8], [3,7] should return 2.
This is a classic sub-array (or substring) matching problem. I implemented two approaches:
When we see O(n * m) in complexity analysis:
n = length of the main array (the bigger dataset).
m = length of the sub-array (the smaller dataset we’re comparing, searching, or iterating over).
O(n * m) means the runtime grows in proportion to both n and m multiplied together.
Approach 1 (Brute Force):
Time Complexity: O(n*m) where n = main array length, m = sub-array length
Space Complexity: O(1)
Approach: Check each possible starting position in the main array
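A minimal sketch of the brute-force approach; the class and method names (SubArrayFinder, findSubArrayIndex) and the -1 not-found convention are illustrative assumptions rather than the exact submitted code.

```java
public final class SubArrayFinder {

    private SubArrayFinder() { }

    // Returns the first index at which sub occurs inside main, or -1 if it does not occur.
    public static int findSubArrayIndex(int[] main, int[] sub) {
        if (main == null || sub == null) {
            return -1;                       // defensive choice for null inputs (an assumption)
        }
        if (sub.length == 0) {
            return 0;                        // convention: an empty sub-array matches at index 0
        }
        // Try each starting position that leaves enough room for the sub-array.
        for (int i = 0; i <= main.length - sub.length; i++) {
            int j = 0;
            while (j < sub.length && main[i + j] == sub[j]) {
                j++;
            }
            if (j == sub.length) {
                return i;                    // full match found starting at index i
            }
        }
        return -1;
    }
}
```

With the example above, findSubArrayIndex(new int[]{4, 9, 3, 7, 8}, new int[]{3, 7}) returns 2.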
Approach 2 (KMP Algorithm):
Time Complexity: O(n+m) - more efficient for large arrays
Space Complexity: O(m) for the partial match table
Approach: Uses a partial match table to skip unnecessary comparisons
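A sketch of the KMP-based variant, assumed to sit alongside the brute-force method in the same hypothetical SubArrayFinder class; the partial match table construction follows the standard algorithm.

```java
// KMP variant: O(n + m) time, O(m) extra space for the partial match table.
public static int findSubArrayIndexKmp(int[] main, int[] sub) {
    if (main == null || sub == null) {
        return -1;
    }
    if (sub.length == 0) {
        return 0;
    }
    // Partial match table: lps[i] = length of the longest proper prefix of
    // sub[0..i] that is also a suffix of sub[0..i].
    int[] lps = new int[sub.length];
    int len = 0;
    for (int i = 1; i < sub.length; ) {
        if (sub[i] == sub[len]) {
            lps[i++] = ++len;
        } else if (len > 0) {
            len = lps[len - 1];              // fall back within the table
        } else {
            lps[i++] = 0;
        }
    }
    // Scan the main array, reusing the table to skip redundant comparisons.
    int j = 0;                               // current position in sub
    for (int i = 0; i < main.length; i++) {
        while (j > 0 && main[i] != sub[j]) {
            j = lps[j - 1];
        }
        if (main[i] == sub[j]) {
            j++;
        }
        if (j == sub.length) {
            return i - sub.length + 1;       // start index of the match
        }
    }
    return -1;
}
```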
Comprehensive unit tests cover typical matches, no-match cases, and edge cases such as empty or oversized sub-arrays.
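A hedged JUnit 5 sketch of such tests, written against the hypothetical SubArrayFinder.findSubArrayIndex from the sketches above.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

class SubArrayFinderTest {

    @Test
    void findsMatchInTheMiddle() {
        // Example from the task statement.
        assertEquals(2, SubArrayFinder.findSubArrayIndex(new int[]{4, 9, 3, 7, 8}, new int[]{3, 7}));
    }

    @Test
    void returnsMinusOneWhenNoMatchExists() {
        assertEquals(-1, SubArrayFinder.findSubArrayIndex(new int[]{4, 9, 3, 7, 8}, new int[]{7, 3}));
    }

    @Test
    void emptySubArrayMatchesAtIndexZero() {
        assertEquals(0, SubArrayFinder.findSubArrayIndex(new int[]{1, 2, 3}, new int[]{}));
    }

    @Test
    void subArrayLongerThanMainArrayIsNotFound() {
        assertEquals(-1, SubArrayFinder.findSubArrayIndex(new int[]{1, 2}, new int[]{1, 2, 3}));
    }
}
```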
1. String Comparison Bug (Line 3):
Problem: Using == for string comparison instead of .equals()
Issue: == compares object references, not string content
Fix: Change to student.getTeacherName().equals("Lee")
Additional consideration: Add null safety: "Lee".equals(student.getTeacherName())
2. Null Pointer Risk (Lines 2-5):
Problem: No null checks for student parameter or method return values
Risk: NullPointerException if student is null or if any getter returns null
Fix: Add proper null validation at the beginning of the method
3. Deep Nesting (Lines 2-9):
Problem: Four levels of nested if statements reduce readability
Impact: Difficult to understand, maintain, and test
Fix: Use guard clauses or combine conditions with && operator
4. Missing Braces (Lines 5-6):
Problem: if statement without braces is error-prone
Risk: Future modifications might introduce bugs
Fix: Always use braces for control structures
5. Unclear Method Purpose:
Problem: Method name "checkStudy" doesn't clearly indicate what it returns
Impact: Other developers need to read implementation to understand behavior
Suggestion: Rename to something like "isEligibleForAdvancedStudy" or add comprehensive JavaDoc
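Pulling these points together, a hedged sketch of the refactored method is shown below; only the teacher-name check is quoted in the review, so the remaining conditions are represented by a placeholder comment, and the boolean return type and the Student getter are assumptions.

```java
// Refactored sketch: guard clauses replace the nested ifs, every branch uses braces,
// and the name states what the method returns.
public boolean isEligibleForAdvancedStudy(Student student) {
    // Guard clause protects against a null parameter up front.
    if (student == null) {
        return false;
    }

    // Null-safe comparison: .equals() on the literal instead of ==.
    if (!"Lee".equals(student.getTeacherName())) {
        return false;
    }

    // The remaining original conditions would become further guard clauses here,
    // each with explicit braces and null checks on the values they read.
    return true;
}
```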
Describe a trend which has happened with enterprise applications over the last two or three years. How do you see this trend influencing corporate application development? What are you doing to prepare for this trend?
Trend: Cloud-Native Microservices and API-Driven Frontends
In recent years, enterprise applications have moved from monolithic systems to microservices with containerization and API-first designs. Java Spring Boot is widely used for backend services, while Angular and other modern frameworks consume APIs to deliver responsive, scalable user experiences. This shift is driven by the need for faster delivery, high availability, and distributed teams working in Agile environments.
Influence on Corporate Application Development:
My Preparation for This Trend:
This trend reshapes enterprise delivery by requiring developers to master both backend microservices and frontend integration, ensuring applications remain scalable, secure, and user-focused.
Describe a product or project you worked on that delivered high value to the user. Which specific aspects did you think were critical in successfully creating value for the user?
Project: RelayDoc – Automated Railway Relay Testing System
I contributed to RelayDoc, a Java-based system that automated testing of railway relays. It measured coil resistance, contact setup, and other parameters—ensuring safety, accuracy, and compliance while eliminating manual errors.
High Value Delivered:
Critical Success Aspects:
1. Spring Boot Backend: Multithreaded design for fast, accurate processing.
2. API & Database Integration: MongoDB storage with Angular dashboards.
3. Agile Collaboration: Delivered iteratively with QA and frontend teams.
4. Scalable Design: Applied SOLID principles and clean architecture.
5. User Feedback: Engineer input shaped dashboards and recovery features.
By combining Java, Angular, and Agile, RelayDoc delivered safer, faster, and more reliable railway operations.
What are your core values/principles as a software engineering professional working in a team environment and why are they important to you?
My Core Software Engineering Values:
1. Clean and Readable Code
Whether I’m writing Java Spring Boot APIs or Angular components, I treat code as communication. Clear naming, modular design, and consistent patterns make the system easy for teammates to understand, maintain, and extend.
2. Test Early, Fix Early
I believe in failing fast by using unit tests, integration tests, and code reviews. In both backend APIs and Angular frontends, catching issues early keeps delivery reliable and builds team trust.
3. User-First Decisions
Every feature should serve the end user. I balance technical design with business value, ensuring APIs are performant and Angular UIs are intuitive and responsive.
4. Shared Ownership
I see the entire system—backend and frontend—as the team’s responsibility. I actively review pull requests, share knowledge, and support teammates so no part of the system becomes a silo.
5. Continuous Learning
Technology evolves quickly, so I stay current with Java, Angular, DevOps tools, and design patterns. I also mentor juniors, which strengthens both my growth and the team’s capabilities.
These values shape how I collaborate and deliver. They ensure that our software is not only technically sound but also valuable, scalable, and sustainable for the business and end users.
Note:
All solutions are my preferred approaches, implemented with comprehensive error handling,
detailed comments, and thorough testing to demonstrate professional software development practices. -- Christopher Natan