Hyperplane linear subspace

Hyperplanes are essential concepts in linear algebra that provide a framework for understanding linear subspaces in multidimensional spaces. They play a crucial role in various fields, including geometry, machine learning, and optimization.

In this article, we will delve into the concept of a hyperplane linear subspace, explore its properties, and examine its applications in different domains.

What is a hyperplane linear subspace?

Let us start with the following definition.

Definition of a hyperplane

A hyperplane is a linear subspace whose dimension is one less than that of the ambient space.

In a two-dimensional space, a hyperplane is simply a line through the origin. In three-dimensional space, a hyperplane is a plane through the origin. More generally, in an n-dimensional space, a hyperplane is an (n-1)-dimensional subspace.

If $E$ is a vector space of finite dimension equal to $n\ge 2$, then for any hyperplane $H$ of $E$, we have $\dim(H)=n-1$.

Remark

Let $E$ be a vector space such that $\dim(E)=n\ge 2$ and let $f:E\to \mathbb{R}$ be a nonzero linear form on $E$. Since $f\neq 0$, its rank equals $1$, so by the rank theorem $\dim(\ker(f))=n-1$. Thus $\ker(f)$ is a hyperplane of $E$. In addition, if we equip $E$ with an inner product $\langle\cdot,\cdot\rangle$ (for instance the canonical inner product when $E=\mathbb{R}^n$), then there exists a unique $a\in E$ such that $f(x)=\langle x,a\rangle$ for all $x\in E$.

Conversely, if $H$ is a hyperplane of $E$, then there exists a nonzero $a\in E$, unique up to a nonzero scalar factor, such that $$ H=\{x\in E: \langle x,a\rangle=0\}.$$

The equation of a hyperplane 

Let $E$ be a vector space of dimension $n\ge 2$ over the field $\mathbb{R}$. Then $E$ can be identified with the space $\mathbb{R}^n$. We define an inner product on $E$ as follows: for vectors $x=(x_1,\cdots,x_n)$ and $y=(y_1,\cdots,y_n)$ in $E$, where the $x_i$ and $y_i$ are real numbers, $$ \langle x,y\rangle=x_1y_1+\cdots+x_ny_n.$$ According to the above remark, if $H$ is a hyperplane of $E$, then there exists a nonzero vector $a=(a_1,\cdots,a_n)\in E\setminus\{0\}$ such that $x\in H$ if and only if $\langle x,a\rangle=0$. Thus the hyperplanes are exactly the solution sets of equations of the form $$ a_1x_1+\cdots+a_n x_n=0.$$ Observe that the hyperplane is normal to the vector $(a_1,\cdots,a_n)$.
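
To make this concrete, here is a minimal sketch in Python, assuming NumPy is available and using the arbitrarily chosen normal vector $a=(1,2,3)$ in $\mathbb{R}^3$: it computes an orthonormal basis of the hyperplane $H=\{x: x_1+2x_2+3x_3=0\}$ as the null space of the corresponding linear form, and checks that $\dim(H)=2$ and that $H$ is orthogonal to $a$.

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])           # normal vector: H = {x : <x, a> = 0}
A = a.reshape(1, -1)                    # matrix of the linear form f(x) = <x, a>

# The null space of A, computed via the SVD, is exactly ker(f) = H.
_, s, Vt = np.linalg.svd(A)
rank = np.sum(s > 1e-12)                # rank of f (here 1, since a != 0)
H_basis = Vt[rank:]                     # rows: an orthonormal basis of H

print("dim H =", H_basis.shape[0])      # expected: 2 = n - 1
print(np.allclose(H_basis @ a, 0))      # every basis vector is orthogonal to a
```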

Exercises on the hyperplane linear subspace

Here we propose some exercises with detailed solutions to illustrate the concept of a hyperplane linear subspace.

Exercise 1

Let $V$ be a complex vector space of dimension $n\ge 2$, let $H$ be a hyperplane of $V$, and let $v\in V$ be a vector. Under what condition are the subspaces $H$ and ${\rm span}(v):=\{\lambda v:\lambda\in\mathbb{C}\}$ complementary in $V$, that is, $V=H\oplus{\rm span}(v)$?

We shall discuss two cases.

First case: $v\in H$. Then for any $\lambda\in\mathbb{C}$, $\lambda v\in H$. Thus ${\rm span}(v)\subset H$, so $H$ and ${\rm span}(v)$ are not complementary.

Second case: $v\notin H$. First of all, since $v\neq 0$, we have $\dim({\rm span}(v))=1$. As $H$ is a hyperplane of $V$, it follows that $\dim(H)=n-1$. Hence \begin{align*} \dim(H)+\dim({\rm span}(v))=n=\dim(V). \end{align*} Now let $x\in H\cap {\rm span}(v)$. Then $x\in H$ and there exists $\lambda\in\mathbb{C}$ such that $x=\lambda v$. We necessarily have $\lambda=0$, because otherwise $v=\lambda^{-1} x\in H$, a contradiction. Hence $x=0_V$, so $H\cap {\rm span}(v)=\{0_V\}$. Together with the dimension count above, this shows that $V=H\oplus{\rm span}(v)$. In conclusion, $H$ and ${\rm span}(v)$ are complementary if and only if $v\notin H$.
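
The following numerical sketch (assuming NumPy, with an arbitrary choice of linear form and vector) illustrates the conclusion of Exercise 1 in $\mathbb{C}^3$: if $v\notin H$, then a basis of $H$ together with $v$ has full rank, so $V=H\oplus{\rm span}(v)$.

```python
import numpy as np

# Coefficients of the nonzero linear form f(x) = a1*x1 + a2*x2 + a3*x3 on C^3
# (an arbitrary choice for the illustration); H = ker(f) is a hyperplane.
a = np.array([1.0 + 1.0j, 2.0, -1.0j])
A = a.reshape(1, -1)

# Orthonormal basis of H = ker(f) from the SVD; for a complex matrix the null
# space is spanned by the conjugates of the rows of Vt with zero singular value.
_, s, Vt = np.linalg.svd(A)
H_basis = Vt[np.sum(s > 1e-12):].conj()     # shape (2, 3): dim H = n - 1 = 2

v = np.array([1.0, 0.0, 0.0])               # f(v) = a1 != 0, so v is not in H
print("f(v) =", (A @ v)[0])                 # nonzero, hence v lies outside H

# A basis of H stacked with v gives 3 linearly independent vectors of C^3,
# so H and span(v) are complementary: C^3 = H ⊕ span(v).
M = np.vstack([H_basis, v])
print("rank =", np.linalg.matrix_rank(M))   # expected: 3
```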

Exercise 2

Let $\psi: \mathbb{R}^n\to \mathbb{R}$ be a nonzero linear form and let $\Phi$ be an endomorphism of $\mathbb{R}^n$. Prove that the kernel of $\psi$ is stable under $\Phi$, i.e. $\Phi(\ker(\psi))\subset \ker(\psi)$, if and only if there exists a real number $\lambda\in\mathbb{R}$ such that $\psi\circ\Phi=\lambda\psi$.

Assume that there exists $\lambda\in\mathbb{R}$ such that $\psi\circ\Phi=\lambda\psi$. Let $x\in \ker(\psi)$; we prove that $\Phi(x)\in \ker(\psi)$. In fact, we have $\psi(x)=0$, and therefore \begin{align*} \psi(\Phi(x))=\lambda \psi(x)=0. \end{align*} This implies that $\Phi(\ker(\psi))\subset \ker(\psi)$.

Conversely, assume that $\ker(\psi)$ is stable under $\Phi$. Observe that if $x\in \ker(\psi)$ then $\psi(\Phi(x))=0$, so that \begin{align*} \psi(\Phi(x))=\lambda \psi(x),\quad \forall x\in \ker(\psi),\;\forall \lambda\in\mathbb{R}. \end{align*} It then suffices to find a real number $\lambda$ and a complementary subspace $K$ of $\ker(\psi)$ such that $\psi\circ\Phi=\lambda\psi$ on $K$. Since $\psi$ is nonzero, the rank theorem shows that $\ker(\psi)$ is a hyperplane, so $\dim(\ker(\psi))=n-1$, and any complementary subspace $K$ of $\ker(\psi)$ satisfies $\dim(K)=1$. Take $a\in \mathbb{R}^n$ such that $\psi(a)\neq 0$; such an $a$ exists because $\psi$ is a nonzero form. Then $a\notin \ker(\psi)$, which implies that ${\rm span}(a)\cap \ker(\psi)=\{0\}$. As $\dim({\rm span}(a))=1$ and $\dim(\ker(\psi))+\dim({\rm span}(a))=n$, we obtain \begin{align*} \ker(\psi)\oplus {\rm span}(a)=\mathbb{R}^n. \end{align*} It thus suffices to find a real number $\lambda$ such that $\psi\circ\Phi=\lambda\psi$ on ${\rm span}(a)$, and in particular $\psi(\Phi(a))=\lambda \psi(a)$. We choose \begin{align*} \lambda=\frac{\psi(\Phi(a))}{\psi(a)}. \end{align*} With this choice, for every $x=\mu a\in {\rm span}(a)$ we have $\psi(\Phi(x))=\mu\psi(\Phi(a))=\mu\lambda\psi(a)=\lambda\psi(x)$. Since the linear forms $\psi\circ\Phi$ and $\lambda\psi$ agree on both $\ker(\psi)$ and ${\rm span}(a)$, whose direct sum is $\mathbb{R}^n$, we conclude that $\psi\circ\Phi=\lambda\psi$.
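
Here is a small numerical check of this result in $\mathbb{R}^3$ (a sketch with an arbitrarily chosen $\psi$ and $\Phi$, assuming NumPy): with $\psi(x)=x_1$, any matrix whose first row is $(m,0,0)$ leaves $\ker(\psi)$ stable, and the computed $\lambda=\psi(\Phi(a))/\psi(a)$ indeed satisfies $\psi\circ\Phi=\lambda\psi$.

```python
import numpy as np

psi = np.array([1.0, 0.0, 0.0])             # the linear form psi(x) = x1

Phi = np.array([[2.0, 0.0, 0.0],            # first row (2, 0, 0): Phi maps
                [1.0, 3.0, -1.0],           # vectors with x1 = 0 to vectors
                [0.0, 4.0, 5.0]])           # with x1 = 0, so ker(psi) is stable

a = np.array([1.0, 0.0, 0.0])               # psi(a) = 1 != 0, so a is not in ker(psi)
lam = (psi @ (Phi @ a)) / (psi @ a)         # lambda = psi(Phi(a)) / psi(a)

# psi o Phi = lambda * psi holds on a basis of R^3, hence on all of R^3.
for x in np.eye(3):
    assert np.isclose(psi @ (Phi @ x), lam * (psi @ x))
print("lambda =", lam)                      # expected: 2.0
```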

Properties of Hyperplanes

Hyperplanes possess several important properties that make them significant in linear algebra. Firstly, an affine hyperplane with equation $a_1x_1 + a_2x_2 + \cdots + a_nx_n = b$ (a linear hyperplane corresponds to $b=0$) divides the space into two half-spaces: points on one side satisfy the inequality $$a_1x_1 + a_2x_2 + \cdots + a_nx_n \le b,$$ and points on the other side satisfy the inequality $$a_1x_1 + a_2x_2 + \cdots + a_nx_n \ge b.$$ Additionally, the normal vector of a hyperplane is orthogonal to every vector lying within the hyperplane. This orthogonality property has applications in various mathematical and computational algorithms.
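
As a short illustration (a sketch assuming NumPy, with an arbitrary hyperplane and sample points), the sign of $a\cdot x-b$ tells on which side of the affine hyperplane $a\cdot x=b$ a point lies:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])               # normal vector of the hyperplane
b = 6.0                                     # offset: the hyperplane is a·x = 6

points = np.array([[1.0, 1.0, 1.0],         # a·x = 6  -> on the hyperplane
                   [0.0, 0.0, 0.0],         # a·x = 0  -> side a·x <= b
                   [3.0, 2.0, 1.0]])        # a·x = 10 -> side a·x >= b

side = np.sign(points @ a - b)              # -1, 0 or +1 for each point
print(side)                                 # expected: [ 0. -1.  1.]
```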

Applications of Hyperplanes

Hyperplanes find diverse applications in multiple fields. In geometry, hyperplanes help define convex sets and facilitate the study of convex hulls and polytopes. In machine learning and pattern recognition, hyperplanes are instrumental in support vector machines (SVMs), which separate data points with different class labels. Optimization algorithms often utilize hyperplanes to define constraints and boundaries for solving optimization problems. Hyperplanes also play a role in computational geometry, computer graphics, and data visualization.
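
For the SVM application, the following sketch (assuming scikit-learn and NumPy are installed, with toy data chosen purely for illustration) fits a linear SVM and reads off the separating hyperplane $w\cdot x+b=0$ from the fitted model's coef_ and intercept_ attributes:

```python
import numpy as np
from sklearn.svm import LinearSVC

# Two well-separated clusters of 2D points (toy data for the illustration).
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],    # class 0
              [3.0, 3.0], [4.0, 3.0], [3.0, 4.0]])   # class 1
y = np.array([0, 0, 0, 1, 1, 1])

clf = LinearSVC(C=1.0).fit(X, y)
w, b = clf.coef_[0], clf.intercept_[0]                # hyperplane w·x + b = 0
print("normal vector w =", w, " offset b =", b)
print(clf.predict([[0.5, 0.5], [3.5, 3.5]]))          # expected: [0 1]
```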

Hyperplane Intersection and Dimensionality Reduction

The intersection of hyperplanes can reveal interesting properties and relationships. When $k$ hyperplanes through the origin with linearly independent normal vectors intersect in an $n$-dimensional space, the resulting subspace has dimension $n-k$; in particular, two distinct hyperplanes in $\mathbb{R}^3$ meet in a line. This intersection property can be leveraged in dimensionality reduction techniques such as Principal Component Analysis (PCA) to capture the most significant features and reduce the complexity of high-dimensional datasets.
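
As a minimal sketch (assuming NumPy, with two arbitrarily chosen hyperplanes), two hyperplanes through the origin in $\mathbb{R}^3$ with linearly independent normals intersect in a one-dimensional subspace, namely the common null space of the two normal vectors:

```python
import numpy as np

normals = np.array([[1.0, 2.0, 3.0],        # hyperplane x1 + 2x2 + 3x3 = 0
                    [1.0, 0.0, -1.0]])      # hyperplane x1 - x3 = 0

# The intersection of the two hyperplanes is the common null space of their
# normal vectors, computed here via the SVD.
_, s, Vt = np.linalg.svd(normals)
rank = np.sum(s > 1e-12)                    # 2 linearly independent normals
line = Vt[rank:]                            # basis of the intersection

print("dim of intersection =", line.shape[0])   # expected: 1 = 3 - 2
print(np.allclose(normals @ line.T, 0))         # expected: True
```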

Conclusion on hyperplane linear subspace

Hyperplanes form a fundamental concept in linear algebra, serving as linear subspaces that define boundaries and separate spaces in multidimensional settings. Understanding hyperplanes and their properties enables us to comprehend geometric relationships, perform classification tasks, and solve optimization problems. The versatility of hyperplanes is evident in their applications across diverse fields, ranging from machine learning to computational geometry. By exploring the concept of hyperplanes, mathematicians, data scientists, and researchers can unlock the potential of linear subspaces, paving the way for innovative solutions and advancements in various disciplines.
