[[Wiki|Home]] | **[[Project Details|Project Details]]** | [[Group Members|Group Members]] | [[UML_Diagrams|UML Diagrams]] | [[Weekly Progress|Weekly Progress]] | [[Code|Code]] | [[Results|Results]]
---
h1. Project Details
---
h2. I. Project Overview
This project addresses the challenge of producing a unified visual output from multiple projectors by developing a software-driven image composition system. The system combines two projected images into a single coherent display while minimizing visible boundaries, luminance variation, and color imbalance across overlapping regions.
The implementation is based on **Python** and the **OpenCV** framework. Computational image-processing techniques such as luminance normalization, transparency-based blending, and spatial intensity control are applied to correct projection inconsistencies caused by illumination differences and surface variation.
---
h2. II. Motivation and Problem Definition
Multi-projector systems commonly exhibit discontinuities in overlapping regions. Since projectors emit light additively, the region where two projections meet receives the combined intensity of both (Left + Right), resulting in a visible "bright band" or seam.
* **The Artifacts** : Visible seams, uneven brightness, and color distortion.
* **The Solution** : This project proposes an automated, software-based alternative that performs alignment and blending algorithmically, eliminating the need for expensive hardware blend units.
---
h2. III. System Capabilities
The system supports:
* **Automated Projection Blending** : Merges left and right images based on a configurable overlap width.
* **Luminance Normalization** : Corrects for the non-linear brightness output of projectors using Gamma correction.
* **Real-Time Processing** : Capable of processing image inputs efficiently using NumPy matrix operations.
* **Modular Architecture** : Separates configuration, logic, and display for maintainability.
---
h2. IV. Algorithms and Theoretical Framework
The system operates on a shared projection surface illuminated by synchronized projectors. To achieve a seamless blend, we implement two core mathematical adjustments: *Alpha Blending* and *Gamma Correction*.
h3. *A. Alpha Blending (Transparency Control)*
Alpha blending merges two visual layers based on a transparency coefficient Alpha. We generate a gradient mask where the transparency of the left image fades from 1.0 to 0.0, and the right image fades from 0.0 to 1.0.
p=. Linear Blending Formula:
p=. !Screenshot%202025-12-25%20at%2015.59.11.png!
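The gradient-mask blend described above can be sketched in NumPy. This is a minimal illustration of the linear formula (out = alpha * L + (1 - alpha) * R), not the project's actual implementation; the function name and strip-based interface are assumptions for the example.

```python
import numpy as np

def blend_overlap(left_strip, right_strip):
    """Linearly blend two overlap strips of equal shape (H, W, 3).

    The left image's alpha fades 1.0 -> 0.0 across the strip and the
    right image's fades 0.0 -> 1.0, so each output pixel is
    out = alpha * L + (1 - alpha) * R.
    """
    h, w = left_strip.shape[:2]
    # Per-column alpha ramp, broadcast over rows and channels.
    alpha = np.linspace(1.0, 0.0, w).reshape(1, w, 1)
    return alpha * left_strip + (1.0 - alpha) * right_strip
```

At the left edge of the strip only the left projector contributes; at the right edge only the right projector does, and intensities cross-fade linearly in between.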
h3. *B. Gamma Correction (Luminance Normalization)*
Standard linear blending fails because projectors are *non-linear devices*. A pixel value of 50% (128) does not result in 50% light output; due to the projector's gamma (approx. 2.2) it results in only ~22% light output. This causes the blended region to appear darker than the rest of the image (a "dark band").
To correct this, we apply an *Inverse Gamma* function to the blend mask before applying it to the image. This "boosts" the pixel values in the overlap region so that the final optical output is linear.
p=. Gamma Correction Formula:
p=. !{width: 500px}clipboard-202512251559-mywr2.png!
By setting the software gamma to match the projector's (typically 2.2), we ensure that: *Software Correction ∘ Projector Gamma = Linear Output*
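The inverse-gamma boost can be sketched as raising the linear blend mask to the power 1/gamma. This is an illustrative sketch of the idea described above, with the function name and default value chosen for the example:

```python
import numpy as np

def gamma_correct_mask(alpha, gamma=3.0):
    """Apply an inverse-gamma boost to a linear blend mask in [0, 1].

    alpha ** (1/gamma) lifts mid-tone mask values so that, after the
    projector applies its own gamma (~2.2), the emitted light falls off
    linearly across the overlap instead of dipping into a dark band.
    """
    return np.clip(alpha, 0.0, 1.0) ** (1.0 / gamma)
```

Note that the endpoints are preserved (0 stays 0, 1 stays 1); only the mid-tones are boosted, which is exactly where the uncorrected blend appears too dark.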
p=. !clipboard-202512251605-hqoyj.png!
p=. *Figure 1.* The relationship between input pixel intensity and corrected output. The +orange line+ shows the software correction (gamma=3) boosting values to counteract the projector's drop (gamma ~2.2). The dashed grey line represents a linear response (no correction), while the blue and green lines represent under-correction (gamma=0.5) and over-correction (gamma=50) respectively.
---
h2. V. Experimental Validation: Gamma Analysis
To validate the necessity of Gamma Correction and verify our software's behavior, we performed a comparative analysis. First, we generated a computational plot of the blending mechanics, followed by physical projection tests using three distinct Gamma values to observe the real-world impact on luminance uniformity.
h3. *A. Blending Mechanics and Theoretical Boost*
To visualize how the inverse gamma correction modifies the standard linear fade, we generated a spatial intensity plot of the overlap region (Figure 2).
p=. !clipboard-202512251609-07rkw.png!
p=. *Figure 2.* Spatial intensity plot of the overlap region. The dashed lines represent a standard linear fade, which results in insufficient light output. The solid blue and red curves show the gamma-corrected output (boosting the mid-tones) applied by our software to compensate for projector non-linearity.
As illustrated in Figure 2, the solid curves bow upward across the overlap zone. This represents the mathematical "boost" applied to the pixel values. By increasing the intensity of the mid-tones before they reach the projector, we counteract the projector's physical tendency to dim those mid-tones, theoretically resulting in a linear, uniform light output.
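This cancellation can be checked numerically: when the software gamma matches the projector gamma, the optical outputs of the two boosted masks sum to a constant across the entire overlap. The projector gamma value below is an assumed constant for illustration.

```python
import numpy as np

# Position across the overlap region, 0.0 (right edge) to 1.0 (left edge).
projector_gamma = 2.2
alpha = np.linspace(0.0, 1.0, 101)

# Inverse-gamma boosted masks sent to each projector.
left_mask = alpha ** (1.0 / projector_gamma)
right_mask = (1.0 - alpha) ** (1.0 / projector_gamma)

# The projector maps pixel value v to light output v ** gamma, so the
# boost and the physical response cancel, leaving a linear fade whose
# two halves sum to 1.0 everywhere.
light = left_mask ** projector_gamma + right_mask ** projector_gamma
```

Since (alpha ** (1/g)) ** g = alpha, the total light is alpha + (1 - alpha) = 1 at every position, which is the uniform brightness the blend is designed to achieve.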
h3. *B. Physical Projection Results*
We tested this theory physically by projecting a test image and varying the gamma parameter in the configuration.
*Case 1: Under-Correction (gamma = 0.5)*
Applying a gamma value below 1.0 results in a curve that bows downward, worsening the natural dimming effect of the projectors.
p=. !clipboard-202512251612-0ii60.jpeg! !clipboard-202512251613-szrss.png!
p=. *Figure 3.* Physical projection result using gamma = 0.5. A distinct "dark band" is visible in the center overlap region due to under-correction of luminance.
As seen in Figure 3, the overlap region is significantly darker than the non-overlapped areas. The software darkened the mid-tones too quickly, compounding the projector's natural light loss. This confirms that a concave (downward-bowing) blending curve is unsuitable for uniform projection blending.
*Case 2: Optimal Compensation (gamma = 3.0)*
Applying a gamma value near the industry standard for display devices (typically between 2.2 and 3.0) provides the necessary upward boost to the mid-tones.
p=. !clipboard-202512251615-u1uov.jpeg! !clipboard-202512251615-r2kcq.png!
p=. *Figure 4.* Physical projection result using gamma = 3.0. The blend is seamless, with uniform brightness achieved across the entire overlap zone.
Figure 4 demonstrates a successful blend. The luminance boost applied by the software effectively cancelled out the projector's physical gamma curve. The sum of the light intensities from both projectors produces a uniform brightness level, rendering the seam invisible to the naked eye.
*Case 3: Over-Correction (gamma = 50.0)*
Applying an extreme gamma value tests the limits of the algorithm. Mathematically, this creates a curve that jumps almost instantly from black to maximum brightness.
p=. !clipboard-202512251616-ks0i1.jpeg! !clipboard-202512251616-ym3xu.png!
p=. *Figure 5.* Physical projection result using gamma = 50.0. The gradient is destroyed, resulting in a hard, bright edge instead of a smooth transition.
As shown in Figure 5, extreme over-correction destroys the gradient necessary for a smooth transition. The overlap area becomes a uniform bright band with hard edges. This validates that while a luminance boost is necessary, the correction curve must be graduated to match the projector's response characteristics; an excessively steep curve eliminates the blending effect entirely.
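The three physical test cases can also be reproduced numerically by computing the total optical output at the centre of the overlap for each software gamma. The projector gamma (~2.2) and the mask formula alpha ** (1/g) are assumptions carried over from the model above; this is a sketch, not a measurement.

```python
import numpy as np

PROJECTOR_GAMMA = 2.2
alpha = np.linspace(0.0, 1.0, 101)

def optical_sum(software_gamma):
    """Total light at each overlap position after both projectors
    apply their physical gamma to the software-corrected masks."""
    exp = PROJECTOR_GAMMA / software_gamma
    return alpha ** exp + (1.0 - alpha) ** exp

for g in (0.5, 3.0, 50.0):
    centre = optical_sum(g)[50]   # middle of the overlap
    print(f"gamma={g:>4}: centre brightness = {centre:.2f}")
```

The model agrees with the photographs: gamma = 0.5 leaves the centre far below full brightness (dark band), gamma = 3.0 lands close to uniform, and gamma = 50.0 pushes the centre toward double brightness (hard bright band).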
---
h2. VI. Software Architecture
The system is structured into four primary classes to ensure modularity and separation of concerns.
1. *ConfigReader*: Manages external configuration parameters (JSON) such as gamma values, screen side, and overlap width.
2. *ProjectionSplit*: Performs the core mathematical operations. It generates the NumPy masks, applies the gamma power functions, and merges the alpha channels.
3. *VideoProcessing*: Handles frame extraction and resource management for video inputs.
4. *ImageDisplayApp*: Provides the Graphical User Interface (GUI) for runtime adjustments and full-screen visualization.
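A minimal sketch of the *ConfigReader* role is shown below. The JSON key names ("gamma", "side", "overlap_pixels") and default values are illustrative assumptions, not the project's actual schema:

```python
import json

class ConfigReader:
    """Load external configuration parameters from a JSON file,
    falling back to defaults for any missing keys."""

    DEFAULTS = {"gamma": 3.0, "side": "left", "overlap_pixels": 200}

    def __init__(self, path="config.json"):
        self.path = path
        self.values = dict(self.DEFAULTS)

    def load(self):
        # Merge file contents over the defaults so partial configs work.
        with open(self.path) as fh:
            self.values.update(json.load(fh))
        return self.values
```

Keeping parameters external like this is what allows the gamma experiments in Section V to be run without touching the blending code.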
---
h2. VII. Functional Requirements
* **Input**: The system accepts standard image formats (JPG, PNG) and video streams.
* **Configuration**: Users must be able to adjust the gamma value, the image size, and overlap_pixels through the config.json file or the GUI.
* **Output**: The system must generate left- and right-specific images that, when projected physically, align to form a single continuous image.
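The output requirement can be sketched as a split that gives each projector its half of the source image plus the shared overlap columns, with the gamma-boosted ramps already applied. Function name, float-image convention, and the exact split arithmetic are assumptions for this example:

```python
import numpy as np

def split_with_overlap(image, overlap, gamma=3.0):
    """Split a float image (H, W, 3, values in [0, 1]) into left/right
    outputs that share `overlap` columns, with inverse-gamma-boosted
    alpha ramps applied inside the shared region."""
    h, w = image.shape[:2]
    half = (w + overlap) // 2          # width of each output image
    left = image[:, :half].copy()
    right = image[:, w - half:].copy()
    # Left fades 1 -> 0 over its last `overlap` columns; right mirrors it.
    ramp = np.linspace(1.0, 0.0, overlap).reshape(1, -1, 1)
    left[:, -overlap:] *= ramp ** (1.0 / gamma)
    right[:, :overlap] *= ramp[:, ::-1] ** (1.0 / gamma)
    return left, right
```

When the two outputs are projected with the configured physical overlap, the boosted ramps sum optically to restore the original image across the seam.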
---
h2. VIII. Development Tools
* **Python & OpenCV**: Used for matrix manipulation and image rendering.
* **NumPy**: Essential for performing the gamma power function on millions of pixels simultaneously for real-time performance.
* **Doxygen**: Used to generate automated technical documentation.