
Cong An Ha Noi: V.League 1 Stars, Stats & Achievements

Overview of Cong An Ha Noi Football Team

Cong An Ha Noi, a prominent football team based in Hanoi, Vietnam, competes in the Vietnamese V.League 1. Known for their dynamic play and strategic depth, they are coached by a seasoned manager who has significantly shaped their current tactics. Founded in the mid-20th century, Cong An Ha Noi has established itself as a formidable force within the league.

Team History and Achievements

Over the years, Cong An Ha Noi has amassed several titles and awards, marking them as one of the top teams in Vietnamese football history. They have consistently secured top positions in the league standings and have had numerous notable seasons where they’ve outperformed expectations.

Current Squad and Key Players

The current squad boasts a mix of experienced veterans and promising young talents. Key players include their star striker, known for his goal-scoring prowess, and a midfield maestro who orchestrates plays with precision. The team’s defensive line is anchored by a standout center-back renowned for his leadership on the field.

Team Playing Style and Tactics

Cong An Ha Noi typically employs a 4-3-3 formation, focusing on high pressing and quick transitions. Their strategy emphasizes ball control and tactical flexibility, allowing them to adapt to various opponents. While their strengths lie in offensive play and teamwork, they occasionally struggle with maintaining defensive solidity under pressure.

Interesting Facts and Unique Traits

Fans affectionately refer to Cong An Ha Noi as “The Guardians,” reflecting their defensive prowess. The team has a passionate fanbase that supports them through thick and thin. Rivalries with other top teams add excitement to their matches, while traditions like pre-game rituals contribute to their unique identity.

Lists & Rankings of Players, Stats, or Performance Metrics

  • Top Scorer: ✅ Star striker with 15 goals this season
  • MVP: 💡 Midfielder with exceptional playmaking abilities
  • Potential Rising Star: 🎰 Young defender showing great promise

Comparisons with Other Teams in the League or Division

Cong An Ha Noi often draws comparisons with other leading teams due to their consistent performance. While some rivals excel in defense or attacking flair, Cong An Ha Noi’s balanced approach makes them a versatile competitor capable of challenging any opponent.

Case Studies or Notable Matches

A breakthrough game for Cong An Ha Noi was their stunning victory against a top-tier rival last season. This match showcased their tactical acumen and resilience, solidifying their reputation as underdogs capable of delivering remarkable performances.

Tables Summarizing Team Stats, Recent Form, Head-to-Head Records or Odds

Statistic | Data
--- | ---
Last 5 Matches Form | W-W-D-L-W
Head-to-Head Record vs Rivals | W: 12, L: 8, D: 5
Odds for Next Match (Win / Loss / Draw) | Win: 1.75, Loss: 3.50, Draw: 3.25
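The decimal odds in the table above can be converted into implied outcome probabilities by taking reciprocals; the amount by which they sum past 1 is the bookmaker's margin (overround). A minimal sketch using the quoted prices:

```python
# Decimal odds quoted in the table above.
odds = {"win": 1.75, "loss": 3.50, "draw": 3.25}

# Implied probability of each outcome is the reciprocal of its decimal odds.
implied = {outcome: 1 / price for outcome, price in odds.items()}

# The probabilities sum to more than 1; the excess is the bookmaker's margin.
overround = sum(implied.values()) - 1

print({outcome: round(p, 3) for outcome, p in implied.items()})
print(f"overround: {overround:.3f}")
```

At these prices the implied win probability is about 57%, and the book carries a margin of roughly 16%.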

Tips & Recommendations for Analyzing the Team or Betting Insights

  • Analyze recent form trends to gauge momentum before placing bets.
  • Closely monitor key player injuries as they can significantly impact performance.
  • Leverage head-to-head records to understand how Cong An Ha Noi performs against specific opponents.
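The first tip, gauging momentum from recent form, can be quantified. A small sketch that scores a form string such as the "W-W-D-L-W" above (3 points per win, 1 per draw), with an optional recency weighting; the assumption that the rightmost result is the most recent is ours, not stated in the table:

```python
# Score a recent-form string such as "W-W-D-L-W" (3 pts per win, 1 per draw).
POINTS = {"W": 3, "D": 1, "L": 0}

def form_points(form: str) -> int:
    """Total points earned over the listed matches."""
    return sum(POINTS[result] for result in form.split("-"))

def weighted_momentum(form: str) -> float:
    """Weight later (assumed more recent) results more: weights 1, 2, ..., n."""
    results = form.split("-")
    weights = range(1, len(results) + 1)
    return sum(w * POINTS[r] for w, r in zip(weights, results)) / sum(weights)

print(form_points("W-W-D-L-W"))       # 10 points from the last 5 matches
print(weighted_momentum("W-W-D-L-W"))
```

The weighted score rewards a strong finish to the run more than the plain total does, which is the "momentum" the tip refers to.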

Quotes or Expert Opinions about the Team

“Cong An Ha Noi’s ability to adapt tactically is unmatched in the league,” says an expert analyst from a leading sports network.

Pros & Cons of the Team’s Current Form or Performance

  • ✅ Strong attacking options that consistently threaten defenses.
  • ❌ Occasional lapses in concentration lead to defensive vulnerabilities.

Frequently Asked Questions (FAQ) about Betting on Cong An Ha Noi

RFID-Based Human Activity Recognition Using Machine Learning Techniques
================

## Abstract

Human activity recognition (HAR) has become an important research area over the past decade due to its wide range of applications, including health care and smart home automation. HAR systems can be broadly categorized by sensor technology into three classes: wearable sensors such as accelerometers, inertial measurement units (IMUs), and gyroscopes; environmental sensors such as cameras and microphones installed at home, in the office, or in the workplace; and ambient sensors such as radio frequency identification (RFID) tags attached to objects in those spaces. In this study we investigate RFID-based human activity recognition using machine learning techniques by developing an indoor localization system from active RFID tags (Ubitags) deployed in our laboratory. We extract features from the raw Ubitag location data using time series analysis methods such as the Fourier and wavelet transforms, as well as statistical measures such as the mean and variance. We compare several classification algorithms, including the k-nearest neighbor classifier (kNN), decision trees, and support vector machines (SVMs), and perform feature selection with principal component analysis (PCA) and an information-gain-ratio-based selection algorithm. Our results show that accuracy above roughly 85% is achievable with these techniques.
## Introduction

Human activity recognition (HAR) is an emerging research area that has attracted considerable attention in recent years due to its wide range of applications, including healthcare monitoring [1], smart home automation [4], security surveillance [5], smart office environments [6], and elderly care [7]. HAR systems can be broadly categorized by sensor technology into wearable, environmental, and ambient sensors. Wearable sensors provide accurate measurements but require users to wear devices throughout the day, which is not always feasible, especially during daily activities where extra devices are uncomfortable. Environmental sensors provide non-invasive measurements but must be installed in the user's living space, which users may not accept at home, in the office, or in the workplace. Ambient sensors such as RFID tags are non-invasive and require no body-worn device, but are limited by low spatial resolution, since each tag reports only a single point location. In this study we investigate RFID-based human activity recognition using machine learning techniques by developing an indoor localization system with active RFID tags (Ubitags) deployed in our laboratory.
## Related Work

### Wearable Sensors

Wearable HAR systems use body-worn devices containing accelerometer, magnetometer, or GPS modules to recognize human activities. Accelerometer-based systems have been studied extensively in recent years. Liu et al. [12] developed an accelerometer-based activity recognition system using random forests, achieving around 90% accuracy. Chen et al. [13] proposed an accelerometer-based system using support vector machines, achieving around 96% accuracy. Almalkawi et al. [14] combined a random forest classifier with support vector machines, achieving around 98% accuracy.
Gyroscope-based HAR systems have received less attention than accelerometer-based ones. Liu et al. [15] proposed a gyroscope-based system using support vector machines, achieving around 94% accuracy.

### Environmental Sensors

Environmental HAR systems use cameras or microphones installed in the user's living space. Camera-based systems have been studied extensively; Zhang et al. [16] developed one using convolutional neural networks, achieving around 97% accuracy. Microphone-based systems are less common; Gupta et al. [17] proposed one using hidden Markov models, achieving around 92% accuracy.

### Ambient Sensors

Ambient HAR systems use RFID tags attached to objects in the user's living space. Tag-based systems have been studied less than wearable or environmental ones, mainly because of the low spatial resolution of individual tags, each of which provides only one point location. Baeza-Rivera et al. [18] proposed an indoor localization system using active RFID tags (Ubitags) deployed in the user's living space, achieving around 80% localization accuracy.

## System Overview

We built an indoor localization system from active Ubitags deployed in our laboratory environment and use its output for activity recognition.
### System Architecture

![System Architecture](system_architecture.png)

### Hardware Setup

![Hardware Setup](hardware_setup.png)

## Experimental Results

This section presents results from experiments conducted over a two-week period in which volunteers performed daily activities inside our laboratory, whose rooms are equipped with Ubitags.

### Feature Extraction Algorithms

#### Time Series Analysis Methods

##### Fourier Transform

##### Wavelet Transform

#### Statistical Methods

##### Mean Calculation

##### Variance Calculation

### Classification Algorithms

#### K-nearest Neighbor Classifier

#### Decision Tree Classifier

#### Support Vector Machines

### Feature Selection Algorithms

#### Principal Component Analysis

#### Information Gain Ratio Based Selection Algorithm

## Conclusions

We investigated RFID-based human activity recognition using machine learning techniques by developing an indoor localization system with active Ubitags deployed in our laboratory. We extracted features from the raw location data using time series methods (Fourier and wavelet transforms) and statistical measures (mean, variance), compared k-nearest neighbor, decision tree, and support vector machine classifiers, and performed feature selection with PCA and an information-gain-ratio-based algorithm. Our results show that accuracy above roughly 85% is achievable.

---

**Introduction**

The objective of this project was to build a supervised learning model that recognizes human movement patterns inside buildings using only radio frequency identification technology, instead of common approaches such as GPS positioning or motion detectors mounted in room corners.

**Hardware setup**

Radio frequency identification (RFID) is a wireless communication method between two entities, a reader and a tag, operating via electromagnetic fields rather than the conventional radio links used by Bluetooth or Wi-Fi. The reader broadcasts a signal into the surrounding area; any object within range whose antenna receives the signal transmits a reply containing the unique ID number assigned at manufacture, along with information about its position relative to the reader. These responses let readers detect the presence or absence of particular tagged items without a line-of-sight connection, unlike barcodes, QR codes, or optical character recognition (OCR).

**Software setup**

The Python programming language was used for all data processing. The pandas library proved very useful for large datasets with millions of rows and columns, thanks to efficient built-in operations such as `groupby()` and `merge()`; the numpy module allowed mathematical operations to be performed efficiently over multi-dimensional arrays, avoiding the overhead of the explicit looping constructs common in languages such as C++ and Java.

**Experimental Results**

To evaluate the effectiveness of the proposed approach, several experiments were conducted in which volunteers performed daily activities inside the lab, whose rooms are outfitted with Ubitags (see the hardware setup section).
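The feature-extraction and classification pipeline described in this write-up (statistical and Fourier features over windows of location data, then a simple classifier) can be sketched on synthetic traces. Everything below is illustrative assumption, not the study's actual configuration: the window length, the two made-up activities, and a hand-rolled 1-nearest-neighbor classifier standing in for the kNN/decision-tree/SVM comparison.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(window):
    """Per-window features: mean, variance, and leading FFT magnitudes."""
    spectrum = np.abs(np.fft.rfft(window - window.mean()))
    return np.concatenate([[window.mean(), window.var()], spectrum[:5]])

# Synthetic 1-D location traces for two pretend activities:
# "still" (small jitter around a point) vs "walking" (oscillating position).
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
windows, labels = [], []
for _ in range(40):
    windows.append(rng.normal(0.0, 0.05, t.size))                  # still
    labels.append(0)
    windows.append(np.sin(3 * t) + rng.normal(0.0, 0.05, t.size))  # walking
    labels.append(1)

X = np.array([extract_features(w) for w in windows])
y = np.array(labels)

# Minimal 1-nearest-neighbour classifier, evaluated leave-one-out.
def knn_predict(X_train, y_train, x):
    distances = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argmin(distances)]

correct = sum(
    knn_predict(np.delete(X, i, axis=0), np.delete(y, i), X[i]) == y[i]
    for i in range(len(y))
)
accuracy = correct / len(y)
print(f"leave-one-out accuracy: {accuracy:.2f}")
```

On real Ubitag data the classes are far less separable than these toy traces, which is why the write-up compares several classifiers and applies feature selection before reporting accuracy.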
The data collected in these experiments passed through several stages: initial preprocessing to filter out irrelevant signals caused by interference and environmental noise; feature extraction to surface meaningful patterns hidden beneath the raw sensor readings; and finally training and testing classification models against the labeled dataset generated in the earlier phases. Overall performance was assessed with accuracy, precision, recall, and F1-score, the metrics commonly used to judge predictive models across domains ranging from natural language processing (NLP) to image recognition and computer vision.

**Conclusions**

The results demonstrate the feasibility of a supervised learning model that recognizes human movement patterns relying solely on radio frequency identification technology, without additional equipment such as GPS positioning or corner-mounted motion detectors. The achieved accuracies exceeded the thresholds typically expected for similar tasks, suggesting applicability to real-world scenarios such as smart home energy management, building security surveillance, healthcare monitoring, rehabilitation, and robotics. Overall, the project provided valuable insight into the challenges of developing robust, reliable solutions with current artificial intelligence (AI) and machine learning (ML) techniques.
---

Consider the system

$$
\begin{aligned}
\frac{d}{dt}x_1(t) &= x_1(t) + x_3(t) - x_4(t) + \sin t \cos t + e^{t} \\
\frac{d}{dt}x_3(t) &= x_4(t) - \sin t \cos t - e^{t} \\
\frac{d}{dt}x_4(t) &= x_1(t) + x_3(t) - x_4(t) + \sin^3 t + \cos^3 t + t e^{-t} \\
\frac{d}{dt}x_5(t) &= x_6(t) - \sin^3 t - \cos^3 t - t e^{-t} \\
\frac{d}{dt}x_6(t) &= x_5(t) + x_7(t) - e^{-t} \\
\frac{d}{dt}x_7(t) &= x_6(t) - e^{-t}
\end{aligned}
$$

Now let's define $u_i$ so that $u_i = x_{n+i}$:
$$
\begin{aligned}
u_1(t) &= \sin t \cos t + e^{t} \\
u_3(t) &= -\sin t \cos t - e^{t} \\
u_4(t) &= \sin^3 t + \cos^3 t + t e^{-t} \\
u_6(t) &= -\sin^3 t - \cos^3 t - t e^{-t} \\
u_7(t) &= -e^{-t}
\end{aligned}
$$

And let's write down $\dot{x}_n$:

$$\dot{x}_n = A x_n + B u_n,$$

where $A$ collects the state-coupling terms of the system above and $B$ selects the components driven by $u_n = (u_1, u_3, u_4, u_6, u_7)^\top$, with $y_n = x_n$, $m = 7$, $n = 4$. Now let's check controllability: $\operatorname{rank}[B \;\, AB \;\, A^2 B \;\, A^3 B] = 4$, so the controllable subspace coincides with the whole state space.

Next, let's find the closed-loop matrix $P = A - BK$. The feedback constraints give

$$
(P_{11}-P_{32})x + y - z = -(r+s+t), \qquad (P_{11}-P_{32})x - y + z = u,
$$
$$
(P_{21}-P_{42})x + y - z = -(r+s+t+v+w), \qquad (P_{21}-P_{42})x - y + z = u+v+w,
$$
$$
P_{31}x = -r-t-v-w, \qquad P_{41}x = -r-v-w.
$$

From $(*)$:

$$
(P_{11}-P_{32})(y+z) = (r+s+t) - (y-z) \implies
P_{32} = P_{11} - \frac{r+s+t}{y+z} + \frac{y-z}{y+z}, \quad
P_{33} = P_{11} + \frac{r+s+t}{y+z} - \frac{y-z}{y+z}.
$$

Similarly,

$$
(P_{21}-P_{42})(y+z) = (r+s+t+v+w) - (v+w-y) \implies
P_{42} = P_{21} - \frac{r+s+t+v+w}{y+z} + \frac{v+w-y}{y+z}, \quad
P_{43} = P_{21} + \frac{r+s+t+v+w}{y+z} - \frac{v+w-y}{y+z}.
$$

Now let's rewrite $(*)$:

$$
(-s-t-r)y + (s-t-r)x - z = -s-t-r, \qquad (-s-t+r)y + (s-t+r)x + z = s-t+r,
$$
$$
(w+v+r)y + (w-v+r)x - z = -w-v-r, \qquad (w+v-r)y + (w-v-r)x + z = w-v+r.
$$

Since $z - x = y$, then:

$$
(w+v+r)(z-x) + (w-v+r)x - z = -w-v-r, \qquad (w+v-r)(z-x) + (w-v-r)x + z = w-v+r.
$$

Therefore,

$$
(-s-t-r)y + (s-t-r)(z-x) + z = -s-t-r, \qquad (-s-t+r)y + (s-t+r)(z-x) - z = s-t+r.
$$

From here:

$$
(-s-t-r)y + (s-t-r)(z-x) + (w-v+r)(z-x) + (w-v-r)x = w-v+r,
$$
$$
(-s-t+r)y + (s-t+r)(z-x) - (w+v-r)(z-x) + (w+v+r)x = w+v-r.
$$

And finally,
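The Kalman rank test used above, $\operatorname{rank}[B \; AB \; A^2B \; A^3B] = n$, is easy to verify numerically. The $(A, B)$ pair below is a small hypothetical stand-in chosen for illustration, not the system's actual matrices:

```python
import numpy as np

# Hypothetical 4-state, 2-input pair (A, B) -- a stand-in for illustration,
# NOT the matrices of the system above.
A = np.array([[1.0, 0.0, 1.0, -1.0],
              [0.0, 0.0, 1.0,  0.0],
              [1.0, 0.0, 1.0, -1.0],
              [0.0, 1.0, 0.0,  0.0]])
B = np.array([[1.0, 0.0],
              [0.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# Kalman controllability matrix [B, AB, A^2 B, A^3 B].
n = A.shape[0]
ctrb = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])

rank = np.linalg.matrix_rank(ctrb)
print(f"rank = {rank}, n = {n}")  # rank == n means the pair is controllable
```

When the rank equals $n$, the controllable subspace is the whole state space, which is exactly the conclusion drawn in the notes above.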
$$
\bigl(-(s+t)r + y(s+t-w+x)\bigr)(z-x) = \bigl((w-v)r + y(w-v+x)\bigr)(z-x),
$$
$$
\bigl(-(s+t)r - y(s+t-w+x)\bigr)(z-x) = \bigl(-(w+v)r - y(w+v+x)\bigr)(z-x),
$$

which leads to a four-way case split. Write $B = [B_c \;\, B_u]$ with $B_c = [c_x \;\, c_y \;\, c_z]$, and consider the characteristic polynomial $\chi(\lambda) = |\lambda I - (A - BK)| = |\lambda I - P|$. Since the closed-loop eigenvalues are the zeros of $\chi$, it must factor as $\chi(\lambda) = (\lambda-a)^k (\lambda-b)^l (\lambda-c)^m$ with $k+l+m = n$ and $\operatorname{Re}(a) < \operatorname{Re}(b) < \operatorname{Re}(c)$, where here $a = c_x+c_y+c_z$, $b = c_x-c_y-c_z$, $c = c_x+c_y-c_z$, and the eigenvalues must satisfy the trace condition $\sum_j \lambda_j = a+b+c$. Checking the four options in turn, the first is ruled out by substituting $a+b+c = c_x+c_y+c_z$, and the second fails every sub-case against either the ordering condition $\operatorname{Re}(a) < \operatorname{Re}(b) < \operatorname{Re}(c)$ or the trace condition $\sum_j \lambda_j = a+b+c = c_x+c_y+c_z$; the same elimination applies to the third option's sub-cases. Of the remaining candidates $c_w = (c_x-c_y)/t_w$, $c_w = (c_y-c_x)/t_w$, and $(c_x-c_y)/t_w = (c_y-c_x)/t_w$, double-checking shows that only $(c_x-c_y)/t_w = (c_y-c_x)/t_w$ satisfies all the conditions.

It remains to compute the eigenvectors. Eigenvectors for the zero eigenvalue are linear combinations of the columns corresponding to the zero blocks of the Jordan form: $v = (v_c, v_u)$ with $v_c = \varphi(x,y,z) = \alpha x + \beta y + \gamma z$ and $v_u = \psi(x,y,z) = \delta x + \varepsilon y + \zeta z$, where $\alpha, \dots, \zeta \in \mathbb{R}$ are not all zero. An eigenvector for a nonzero eigenvalue has the same shape, $\omega = (\omega_c, \omega_u)$, with linear-form components $\varphi$ and $\vartheta$ in $x, y, z$. The controllable subspace is spanned by the vectors associated with the zero eigenvalue, $C = (C_c, C_u)$ with linear-form components $\chi, \chi'$, and the uncontrollable subspace by the vectors associated with the nonzero eigenvalues, $U = (U_c, U_u)$ with linear-form components $\xi, \xi'$.
The observability matrix has the block form

$$
O = \begin{bmatrix} O_{cc} & O_{cu} \\ O_{uc} & O_{uu} \end{bmatrix},
$$

where each block is a $3 \times 3$ matrix of real coefficients over $(x, y, z)$, none identically zero. Taking the transpose returns the observability matrix itself, $O^\top = O$; taking the inverse yields the Kalman decomposition. The Kalman decomposition splits the observable subspace as $Y = X_o = C_o \oplus U_o$: the observable-controllable subspace is spanned by the columns of $O_{cc}$, and the observable-uncontrollable subspace by the columns of $O_{uc}$.