[[Wiki|Home]] |  **[[Project Details|Project Details]]**  |  [[Group Members|Group Members]]  |  [[UML_Diagrams|UML Diagrams]]  |  [[Weekly Progress|Weekly Progress]]  |  [[Code|Code]]  |  [[Results|Results]]

---

> h1. Project Details

---

h2. I. Project Overview

This project addresses the challenge of producing a unified visual output from multiple projectors by developing a software-driven image composition system. The system combines two projected images into a single coherent display while minimizing visible boundaries, luminance variation, and color imbalance across overlapping regions.

The implementation is based on **Python** and the **OpenCV** framework. Computational image-processing techniques such as luminance normalization, transparency-based blending, and spatial intensity control are applied to correct projection inconsistencies caused by illumination differences and surface variation.

---

h2. II. Motivation and Problem Definition

Multi-projector systems commonly exhibit discontinuities in overlapping regions. Because both projectors emit light, the region where the two projections overlap receives the combined intensity of the left and right images (Left + Right), resulting in a visible "bright band" or seam.

* **The Artifacts**: Visible seams, uneven brightness, and color distortion.
* **The Solution**: This project proposes an automated, software-based alternative that performs alignment and blending algorithmically, eliminating the need for expensive hardware blend units.

---

h2. III. System Capabilities

The system supports:

* **Automated Projection Blending**: Merges left and right images based on a configurable overlap width.
* **Luminance Normalization**: Corrects for the non-linear brightness output of projectors using Gamma correction.
* **Real-Time Processing**: Capable of processing image inputs efficiently using NumPy matrix operations.
* **Modular Architecture**: Separates configuration, logic, and display for maintainability.

---

h2. IV. Algorithms and Theoretical Framework

The system operates on a shared projection surface illuminated by synchronized projectors. To achieve a seamless blend, we implement two core mathematical adjustments: *Alpha Blending* and *Gamma Correction*.

h3. *A. Alpha Blending (Transparency Control)*

Alpha blending merges two visual layers based on a transparency coefficient (alpha). We generate a gradient mask across the overlap in which the transparency of the left image fades from 1.0 to 0.0 while that of the right image fades from 0.0 to 1.0.

> p=. Linear Blending Formula:

p=. !Screenshot%202025-12-25%20at%2015.59.11.png!
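
The following simplified sketch illustrates this formula with NumPy (the function and variable names here are illustrative and may differ from the actual implementation):

<pre><code class="python">
import numpy as np

def linear_blend(left_overlap, right_overlap):
    """Blend the overlapping strips of the left and right images.

    Both inputs have shape (height, overlap_pixels, channels).
    """
    overlap_pixels = left_overlap.shape[1]
    # Transparency coefficient: 1.0 -> 0.0 for the left image across the overlap,
    # so (1 - alpha) automatically runs 0.0 -> 1.0 for the right image.
    alpha = np.linspace(1.0, 0.0, overlap_pixels, dtype=np.float32)
    alpha = alpha[None, :, None]  # broadcast over rows and channels

    # Linear blending formula: blended = alpha * left + (1 - alpha) * right
    blended = alpha * left_overlap + (1.0 - alpha) * right_overlap
    return np.clip(blended, 0, 255).astype(np.uint8)
</code></pre>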

h3. *B. Gamma Correction (Luminance Normalization)*

Standard linear blending fails because projectors are *non-linear devices*. A pixel value of 50% (128) does not produce 50% light output; because of the projector's gamma (approx. 2.2), it produces only about 22% of full light output. This causes the blended region to appear darker than the rest of the image (a "dark band").

To correct this, we apply an *Inverse Gamma* function to the blend mask before applying it to the image. This "boosts" the pixel values in the overlap region so that the final optical output is linear.

> p=. Gamma Correction Formula:

p=. !{width: 500px}clipboard-202512251559-mywr2.png!

By setting gamma to match the projector (typically 2.2), we ensure that: *Software Correction × Projector Gamma = Linear Output*

p=. !clipboard-202512251605-hqoyj.png!

p=. *Figure 1.* The relationship between input pixel intensity and corrected output. The +orange line+ shows the software correction (gamma=3) boosting values to counteract the projector's drop (gamma ~2.2). The dashed grey line represents a linear response (no correction), while the blue and green lines represent under-correction (gamma=0.5) and over-correction (gamma=50) respectively.
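
A minimal sketch of this correction applied to the blend mask is shown below (function name and default values are illustrative): the linear ramp is raised to the power 1/gamma before it is applied, so that the projector's own gamma curve brings the emitted light back to a linear fade.

<pre><code class="python">
import numpy as np

def gamma_corrected_mask(overlap_pixels, gamma=2.2):
    """Return a 1.0 -> 0.0 blend ramp pre-boosted by the inverse gamma."""
    alpha = np.linspace(1.0, 0.0, overlap_pixels, dtype=np.float32)  # linear fade
    # Inverse gamma boost: the projector later applies x**gamma, so feeding it
    # alpha**(1/gamma) yields (alpha**(1/gamma))**gamma = alpha in emitted light.
    return np.power(alpha, 1.0 / gamma)

left_mask = gamma_corrected_mask(200, gamma=2.2)
right_mask = left_mask[::-1]  # mirrored ramp for the right projector

# Sanity check with an idealized projector of the same gamma: the summed
# light across the overlap stays constant (~1.0), i.e. no dark or bright band.
combined_light = left_mask**2.2 + right_mask**2.2
print(combined_light.min(), combined_light.max())  # both ~1.0
</code></pre>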

---

h2. V. Experimental Validation: Gamma Analysis

To validate the necessity of Gamma Correction and verify our software's behavior, we performed a comparative analysis. First, we generated a computational plot of the blending mechanics, followed by physical projection tests using three distinct Gamma values to observe the real-world impact on luminance uniformity.

h3. *A. Blending Mechanics and Theoretical Boost*

To visualize how the inverse gamma correction modifies the standard linear fade, we generated a spatial intensity plot of the overlap region (Figure 2).

p=. !clipboard-202512251609-07rkw.png!

p=. *Figure 2.* Spatial intensity plot of the overlap region. The dashed lines represent a standard linear fade, which results in insufficient light output. The solid blue and red curves show the gamma-corrected output (boosting the mid-tones) applied by our software to compensate for projector non-linearity.

As illustrated in Figure 2, the solid curves bow upward across the overlap zone. This represents the mathematical "boost" applied to the pixel values. By increasing the intensity of the mid-tones before they reach the projector, we counteract the projector's physical tendency to dim those mid-tones, theoretically resulting in a linear, uniform light output.
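
A comparable plot can be reproduced with a few lines of Matplotlib; the sketch below is a reconstruction for illustration rather than the project's actual plotting script, and the gamma of 2.2 is only an assumed example value.

<pre><code class="python">
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0.0, 1.0, 200)      # normalized position across the overlap
gamma = 2.2

linear_left, linear_right = 1.0 - x, x                   # standard linear fade
boost_left = np.power(linear_left, 1.0 / gamma)          # gamma-corrected masks
boost_right = np.power(linear_right, 1.0 / gamma)        # bow upward (mid-tone boost)

plt.plot(x, linear_left, "b--", x, linear_right, "r--")  # dashed: linear fade
plt.plot(x, boost_left, "b-", x, boost_right, "r-")      # solid: corrected output
plt.xlabel("Position across the overlap")
plt.ylabel("Mask intensity")
plt.title("Linear fade vs. gamma-corrected fade")
plt.show()
</code></pre>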

h3. *B. Physical Projection Results*

We tested this theory physically by projecting a test image and varying the gamma parameter in the configuration.

*Case 1: Under-Correction (gamma = 0.5)*
Applying a gamma value below 1.0 results in a curve that bows downward, worsening the natural dimming effect of the projectors.

p=. !clipboard-202512251612-0ii60.jpeg! !clipboard-202512251613-szrss.png!

p=. *Figure 3.* Physical projection result using gamma = 0.5. A distinct "dark band" is visible in the center overlap region due to under-correction of luminance.

As seen in Figure 3, the overlap region is significantly darker than the non-overlapped areas. The software darkened the mid-tones too quickly, compounding the projector's natural light loss. This confirms that a concave (downward-bowing) blending curve is unsuitable for uniform projection blending.

*Case 2: Optimal Compensation (gamma = 3.0)*
Applying a gamma value near the industry standard for display devices (typically between 2.2 and 3.0) provides the necessary upward boost to the mid-tones.

p=. !clipboard-202512251615-u1uov.jpeg! !clipboard-202512251615-r2kcq.png!

p=. *Figure 4.* Physical projection result using gamma = 3.0. The blend is seamless, with uniform brightness achieved across the entire overlap zone.

Figure 4 demonstrates a successful blend. The luminance boost applied by the software effectively cancelled out the projector's physical gamma curve. The sum of the light intensities from both projectors produces a uniform brightness level, rendering the seam invisible to the naked eye.

*Case 3: Over-Correction (gamma = 50.0)*
Applying an extreme gamma value tests the limits of the algorithm. Mathematically, this creates a curve that jumps almost instantly from black to maximum brightness.

p=. !clipboard-202512251616-ks0i1.jpeg! !clipboard-202512251616-ym3xu.png!

p=. *Figure 5.* Physical projection result using gamma = 50.0. The gradient is destroyed, resulting in a hard, bright edge instead of a smooth transition.

As shown in Figure 5, extreme over-correction destroys the gradient necessary for a smooth transition. The overlap area becomes a uniform bright band with hard edges. This validates that while a luminance boost is necessary, the correction curve must be graduated to match the projector's response characteristics; an excessively steep curve eliminates the blending effect entirely.
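
The three cases can also be compared numerically with a simple model. The sketch below assumes an idealized projector response of (pixel value)^2.2, which is only a modeling assumption; the exact figures depend on the real projector response, but the dark-band, near-uniform, and bright-band trend of Figures 3-5 is reproduced.

<pre><code class="python">
import numpy as np

projector_gamma = 2.2                      # assumed physical response: light = value**2.2
alpha = np.linspace(1.0, 0.0, 101)         # linear fade across the overlap

for software_gamma in (0.5, 3.0, 50.0):
    left = np.power(alpha, 1.0 / software_gamma)          # corrected left mask
    right = np.power(1.0 - alpha, 1.0 / software_gamma)   # corrected right mask
    # Light that actually reaches the screen from both projectors combined:
    combined = np.power(left, projector_gamma) + np.power(right, projector_gamma)
    centre = combined[50]                                  # centre of the overlap
    print(f"gamma = {software_gamma:>4}: centre brightness ~ {centre:.2f} (1.0 = uniform)")

# Under this model: gamma = 0.5 gives ~0.09 (dark band), gamma = 50.0 gives ~1.94
# (bright band with hard edges); values near the projector's own gamma stay closest to 1.0.
</code></pre>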

---

h2. VI. Software Architecture

The system is structured into four primary classes to ensure modularity and separation of concerns.

1. *ConfigReader*: Manages external configuration parameters (JSON) such as gamma values, screen side, and overlap width.
2. *ProjectionSplit*: Performs the core mathematical operations. It generates the NumPy masks, applies the gamma power functions, and merges the alpha channels.
3. *VideoProcessing*: Handles frame extraction and resource management for video inputs.
4. *ImageDisplayApp*: Provides the Graphical User Interface (GUI) for runtime adjustments and full-screen visualization.
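
A structural sketch of how these classes might fit together is shown below. The four class names come from this page; the method names, signatures, and configuration keys are simplified illustrations and may differ from the actual implementation.

<pre><code class="python">
# Structural sketch only: method names and signatures are illustrative.
import json
import numpy as np

class ConfigReader:
    """Loads gamma, screen side, and overlap width from a JSON file."""
    def __init__(self, path):
        with open(path) as f:
            self.params = json.load(f)   # e.g. {"gamma": 3.0, "overlap_pixels": 200, "side": "left"}

class ProjectionSplit:
    """Builds the gamma-corrected alpha mask and applies it to a frame edge."""
    def __init__(self, gamma, overlap_pixels):
        ramp = np.linspace(1.0, 0.0, overlap_pixels, dtype=np.float32)
        self.mask = np.power(ramp, 1.0 / gamma)   # inverse-gamma boosted fade

    def apply(self, frame, side):
        out = frame.astype(np.float32)
        n = len(self.mask)
        if side == "left":
            out[:, -n:] *= self.mask[None, :, None]         # fade out the right edge
        else:
            out[:, :n] *= self.mask[::-1][None, :, None]    # fade in the left edge
        return out.astype(np.uint8)

class VideoProcessing:
    """Extracts frames from a video source and releases it when done."""
    ...

class ImageDisplayApp:
    """GUI layer: full-screen display and runtime parameter adjustment."""
    ...
</code></pre>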

---

h2. VII. Functional Requirements

* **Input**: The system accepts standard image formats (JPG, PNG) and video streams.
* **Configuration**: Users must be able to adjust the gamma and overlap_pixels via a configuration file or GUI (see the example configuration below).
* **Output**: The system must generate left- and right-specific images that, when projected physically, align to form a single continuous image.
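
A hypothetical example of such a configuration file is shown below; the gamma and overlap_pixels keys follow the requirements above and the screen-side parameter from Section VI, while the exact file layout is only an assumption.

<pre><code class="json">
{
  "gamma": 3.0,
  "overlap_pixels": 200,
  "side": "left"
}
</code></pre>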

---

h2. VIII. Development Tools

* **Python & OpenCV**: Used for matrix manipulation and image rendering.
* **NumPy**: Essential for performing the gamma power function on millions of pixels simultaneously for real-time performance.
* **Doxygen**: Used to generate automated technical documentation.