# Basic inequalities for statistical submanifolds in Golden-like statistical manifolds

• Mohamd Saleem Lone, Oğuzhan Bahadir, Choonkil Park and Inho Hwang
From the journal Open Mathematics

## Abstract

In this paper, we introduce and study Golden-like statistical manifolds. We obtain some basic inequalities for curvature invariants of statistical submanifolds in Golden-like statistical manifolds. Also, in support of our definition, we provide a couple of examples.

MSC 2010: 53C15; 53C25; 53C40; 53B25

## 1 Introduction

The comparison relationships between intrinsic and extrinsic invariants are among the basic problems in submanifold theory. In [1], Chen introduced some curvature invariants and derived optimal relationships between the intrinsic invariants (Chen invariants) and the extrinsic invariants, which later became an active and fruitful area of research (see, for instance, [1,2,3]).

On the other hand, the notion of Casorati curvature (an extrinsic invariant) for surfaces was originally introduced in 1890 (see [4]). The Casorati curvature gives a better intuition of the curvature than the Gaussian curvature does: the Gaussian curvature of a developable surface is zero, which motivated Casorati to put forward the notion of the Casorati curvature of a surface, defined as $C = \frac{1}{2}(\kappa_1^2 + \kappa_2^2)$, where $\kappa_1, \kappa_2$ are the principal curvatures. For example, for developable surfaces (say, a cylinder), the Gaussian curvature vanishes, while the Casorati curvature $C$ surely does not. The Casorati curvature of a submanifold in a Riemannian manifold is defined as the normalized squared length of the second fundamental form [5].
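This distinction can be illustrated with a quick numerical sketch (the radius value below is arbitrary and chosen purely for illustration):

```python
# Principal curvatures of a circular cylinder of radius r in Euclidean 3-space:
# kappa_1 = 0 along the rulings, kappa_2 = 1/r around the circular section.
r = 2.0
kappa1, kappa2 = 0.0, 1.0 / r

gaussian_K = kappa1 * kappa2              # K = kappa_1 * kappa_2
casorati_C = (kappa1**2 + kappa2**2) / 2  # C = (kappa_1^2 + kappa_2^2) / 2

print(gaussian_K)  # 0.0   -- the cylinder is developable
print(casorati_C)  # 0.125 -- C still detects the bending that K misses
```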

In the past decade, various geometers have been attracted to the study of Chen-type comparison relationships between the Casorati curvature and intrinsic invariants. For some references in this direction, we refer to [6,7,8,9,10,11,12]. Submanifolds attaining equality in the Chen-type inequalities are called ideal submanifolds; the name ideal is motivated by the fact that these submanifolds inherit the least possible tension from the ambient manifold (see [13]).

In 1985, Amari introduced the notion of statistical manifolds via information geometry (see [14]). Statistical manifolds are endowed with a pair of dual torsion-free connections, analogous to the conjugate connections of affine geometry (see [15]). The dual connections are not metric, so it is very hard to define a notion of sectional curvature using the canonical definitions of Riemannian geometry. In [16], Opozda gave a definition of sectional curvature on a statistical manifold. While studying the geometric properties of a submanifold, a very important problem is to obtain sharp relations between the intrinsic and the extrinsic invariants, and a vast number of such relations are expressed by certain inequalities. For example, for a surface $M$ in Euclidean 3-space we have the Euler inequality $K \le H^2$, where $H$ is the mean curvature (an extrinsic invariant) and $K$ is the Gaussian curvature (an intrinsic invariant). Equality holds precisely at umbilical points, that is, at points where $M$ is congruent to an open piece of a plane or a sphere. Chen [17] obtained the same inequality for submanifolds of real space forms. Then, in [18], Chen obtained the Chen-Ricci inequality, a sharp relation between the squared mean curvature and the Ricci curvature of a Riemannian submanifold of a real space form.

In recent years, statistical manifolds have been studied very actively. In [19], Takano studied statistical manifolds with almost complex and almost contact structures. In 2015, Vîlcu and Vîlcu [20] studied statistical manifolds in quaternionic settings and proposed several open problems. Answering one of those open problems, Aquib [21] obtained some curvature properties of submanifolds and a couple of inequalities for totally real statistical submanifolds of quaternionic Kaehler-like statistical space forms. In 2019, Chen et al. derived a Chen first inequality for statistical submanifolds in Hessian manifolds of constant Hessian curvature [22]. In the same year, following the paper of Chen et al., Aytimur et al. [23] obtained Chen-type inequalities for statistical submanifolds of Kaehler-like statistical manifolds. Very recently, in 2020, Decu et al. obtained inequalities for the Casorati curvature of statistical submanifolds in holomorphic statistical manifolds of constant holomorphic curvature [24]. For some of the recent works, we refer to [15,19,25,26,27,28].

Motivated by the aforementioned studies, we define Golden-like statistical manifolds and obtain certain interesting inequalities. The structure of this paper is as follows. In Section 2, we give the definition of Golden-like statistical manifolds and, in support of the definition, construct examples. In the next section, we obtain the main inequalities and prove the results for their equality cases.

## 2 Golden-like statistical manifold

Let $M$ be a smooth manifold. A $(1,1)$ tensor field $T$ on $M$ is said to be a polynomial structure if $T$ satisfies an algebraic equation [29,30]

$$P(x) = x^n + b_n x^{n-1} + \cdots + b_2 x + b_1 I = 0,$$

where $I$ is the $(1,1)$ identity tensor field and $T^{n-1}(q), T^{n-2}(q), \ldots, T(q), I$ are linearly independent at every point $q \in M$. The polynomial $P(x)$ is called the structure polynomial. For $P(x) = x^2 + I$ and $P(x) = x^2 - I$, we obtain an almost complex structure and an almost product structure, respectively. It has to be noted here that the existence of an almost complex structure forces the manifold to be even-dimensional. For $P(x) = x^2$, we obtain the notion of an almost tangent structure.

## Definition 1

[29,31,32] Let $(M, g)$ be a semi-Riemannian manifold and let $\phi$ be a $(1,1)$ tensor field on $M$ satisfying the following equation:

$$\phi^2 = \phi + I.$$

Then the tensor field $\phi$ is called a Golden structure on $M$. If the semi-Riemannian metric $g$ is $\phi$-compatible, then $(M, g, \phi)$ is called a Golden semi-Riemannian manifold.

For a $\phi$-compatible metric $g$, we have the following:

(2.1) $$g(\phi X, Y) = g(X, \phi Y),$$

(2.2) $$g(\phi X, \phi Y) = g(\phi^2 X, Y) = g(\phi X, Y) + g(X, Y), \qquad \forall\, X, Y \in \Gamma(TM).$$

A remarkable fact about Golden structures is that they appear in pairs: if $\phi$ is a Golden structure, then $\hat{\phi} = I - \phi$ is also a Golden structure. The same happens for almost tangent structures ($R$ and $-R$) and almost complex structures ($J$ and $-J$). So it is natural to ask about the connection between Golden and product structures.
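Indeed, using only the defining identity $\phi^2 = \phi + I$, a one-line computation (included here for completeness) confirms that $\hat{\phi} = I - \phi$ is again a Golden structure:

```latex
\hat{\phi}^{2} = (I - \phi)^{2}
              = I - 2\phi + \phi^{2}
              = I - 2\phi + (\phi + I)
              = (I - \phi) + I
              = \hat{\phi} + I .
```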

Let $M$ be a Riemannian manifold and denote by $\nabla$ a torsion-free affine connection on it. The triple $(M, \nabla, g)$ is called a statistical manifold if $\nabla g$ is symmetric. We define another affine connection $\nabla^*$ by

(2.3) $$X g(Y, Z) = g(\nabla_X Y, Z) + g(Y, \nabla^*_X Z)$$

for vector fields $X$, $Y$, and $Z$ on $M$. The affine connection $\nabla^*$ is called conjugate (or dual) to $\nabla$ with respect to $g$. The affine connection $\nabla^*$ is torsion-free, $\nabla^* g$ is symmetric, and we have $2\nabla^0 = \nabla + \nabla^*$, where $\nabla^0$ is the Levi-Civita connection of $g$. Clearly, the triple $(M, \nabla^*, g)$ is statistical. We denote by $R$ and $R^*$ the curvature tensors on $M$ with respect to $\nabla$ and its conjugate $\nabla^*$, respectively. The curvature tensor field $R^0$ associated with $\nabla^0$ is called the Riemannian curvature tensor. Then we find

$$g(R^*(X, Y)Z, W) = -g(Z, R(X, Y)W)$$

for vector fields $X, Y, Z$, and $W$ on $M$, where $R(X, Y)Z = \nabla_X \nabla_Y Z - \nabla_Y \nabla_X Z - \nabla_{[X,Y]} Z$.

Since, in general, the dual connections are not metric, one cannot define the sectional curvature in the statistical setting as in semi-Riemannian geometry. Thus, Opozda proposed two notions of sectional curvature on statistical manifolds (see [16,33]).

Let $M$ be a statistical manifold and $\pi$ a plane section in $TM$ with orthonormal basis $\{X, Y\}$. The sectional $K$-curvature is defined in [16] as

$$K(\pi) = \frac{1}{2}[g(R(X, Y)Y, X) + g(R^*(X, Y)Y, X) - 2g(R^0(X, Y)Y, X)].$$

## Definition 2

Let $(M, g, \phi)$ be a Golden semi-Riemannian manifold endowed with another tensor field $\phi^*$ of type $(1,1)$ satisfying

(2.4) $$g(\phi X, Y) = g(X, \phi^* Y)$$

for vector fields X and Y . In view of (2.4), we easily derive

(2.5) $$(\phi^*)^2 X = \phi^* X + X,$$

(2.6) $$g(\phi X, \phi^* Y) = g(\phi X, Y) + g(X, Y).$$

Then $(M, g, \phi)$ is called a Golden-like statistical manifold.

According to (2.5) and (2.6), the tensor fields $\phi + \phi^*$ and $\phi - \phi^*$ are symmetric and skew-symmetric with respect to $g$, respectively. Equations (2.4), (2.5), and (2.6) imply the following proposition.

## Proposition 1

$(M, g, \phi)$ is a Golden-like statistical manifold if and only if $(M, g, \phi^*)$ is.

We remark that if we choose $\phi^* = \phi$ in a Golden-like statistical manifold, then we recover a Golden semi-Riemannian manifold.

We first present an example of a Golden-Riemannian manifold.

## Example 1

[34] Consider the Euclidean 6-space $\mathbb{R}^6$ with standard coordinates $(x_1, x_2, x_3, x_4, x_5, x_6)$. Let $\phi$ be a $(1,1)$ tensor field on $\mathbb{R}^6$ defined by

$$\phi(x_1, x_2, x_3, x_4, x_5, x_6) = (\psi x_1, \psi x_2, \psi x_3, (1-\psi)x_4, (1-\psi)x_5, (1-\psi)x_6)$$

for any vector field $(x_1, x_2, x_3, x_4, x_5, x_6) \in \mathbb{R}^6$, where $\psi = \frac{1+\sqrt{5}}{2}$ and $1 - \psi = \frac{1-\sqrt{5}}{2}$ are the roots of the equation $x^2 = x + 1$. Then we obtain

$$\phi^2(x_1, \ldots, x_6) = (\psi^2 x_1, \psi^2 x_2, \psi^2 x_3, (1-\psi)^2 x_4, (1-\psi)^2 x_5, (1-\psi)^2 x_6) = (\psi x_1, \psi x_2, \psi x_3, (1-\psi)x_4, (1-\psi)x_5, (1-\psi)x_6) + (x_1, x_2, x_3, x_4, x_5, x_6).$$

Thus, we have $\phi^2 = \phi + I$. Moreover, one can easily see that the standard metric $\langle\,,\rangle$ on $\mathbb{R}^6$ is $\phi$-compatible. Hence, $(\mathbb{R}^6, \langle\,,\rangle, \phi)$ is a Golden Riemannian manifold.
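The defining identities of Example 1 can be checked numerically; the following sketch (writing the matrix of $\phi$ in the standard basis, purely as an illustration of the example above) verifies them with NumPy:

```python
import numpy as np

psi = (1 + np.sqrt(5)) / 2  # Golden mean, a root of x^2 = x + 1

# Matrix of phi from Example 1: psi on the first three coordinates,
# 1 - psi on the last three.
phi = np.diag([psi] * 3 + [1 - psi] * 3)
I = np.eye(6)

# Golden condition: phi^2 = phi + I
assert np.allclose(phi @ phi, phi + I)

# phi-compatibility of the standard metric: <phi X, Y> = <X, phi Y>,
# which here is just the symmetry of the matrix of phi.
assert np.allclose(phi, phi.T)
print("Example 1 verified")
```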

Next, we construct an example of a Golden-like statistical manifold.

## Example 2

Consider the semi-Euclidean space $\mathbb{R}^3_1$ with standard coordinates $(x_1, x_2, x_3)$ and the semi-Riemannian metric $g$ of signature $(-, +, +)$. Let $\phi$ be a $(1,1)$ tensor field on $\mathbb{R}^3_1$ defined by

$$\phi(x_1, x_2, x_3) = \frac{1}{2}\left(x_1 + \sqrt{5}x_2,\; x_2 + \sqrt{5}x_1,\; 2\psi x_3\right)$$

for any vector field $(x_1, x_2, x_3) \in \mathbb{R}^3_1$, where $\psi = \frac{1+\sqrt{5}}{2}$ is the Golden mean. Then we obtain $\phi^2 = \phi + I$, which implies that $\phi$ is a Golden structure on $\mathbb{R}^3_1$.

Now we define a $(1,1)$ tensor field $\phi^*$ on $\mathbb{R}^3_1$ by

$$\phi^*(x_1, x_2, x_3) = \frac{1}{2}\left(x_1 - \sqrt{5}x_2,\; x_2 - \sqrt{5}x_1,\; 2\psi x_3\right).$$

Thus, we have $(\phi^*)^2 = \phi^* + I$. Moreover, equation (2.4) is satisfied. Hence, $(\mathbb{R}^3_1, g, \phi)$ is a Golden-like statistical manifold.
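The conditions of Example 2 can also be verified numerically. In the sketch below, `G` is the matrix of the metric of signature $(-,+,+)$, and the compatibility condition (2.4) is rewritten in matrix form as $\phi^{\mathsf{T}} G = G \phi^*$ (an elementary reformulation introduced here only for the check):

```python
import numpy as np

psi = (1 + np.sqrt(5)) / 2
s5 = np.sqrt(5)

# Matrices of phi and phi* from Example 2 on R^3_1.
phi = np.array([[0.5, s5 / 2, 0.0], [s5 / 2, 0.5, 0.0], [0.0, 0.0, psi]])
phi_star = np.array([[0.5, -s5 / 2, 0.0], [-s5 / 2, 0.5, 0.0], [0.0, 0.0, psi]])
G = np.diag([-1.0, 1.0, 1.0])  # semi-Riemannian metric of signature (-, +, +)
I = np.eye(3)

# Both tensor fields are Golden structures.
assert np.allclose(phi @ phi, phi + I)
assert np.allclose(phi_star @ phi_star, phi_star + I)

# Compatibility (2.4): g(phi X, Y) = g(X, phi* Y), i.e. phi^T G = G phi*.
assert np.allclose(phi.T @ G, G @ phi_star)
print("Example 2 verified")
```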

Now we generalize the above example.

## Example 3

Let $\mathbb{R}^{2n+m}_{n}$ be the $(2n+m)$-dimensional affine space with the coordinate system $(x_1, \ldots, x_n, y_1, \ldots, y_n, z_1, \ldots, z_m)$. We define a semi-Riemannian metric $g$ of signature $(\underbrace{-, \ldots, -}_{n \text{ times}}, \underbrace{+, \ldots, +}_{(n+m) \text{ times}})$ and a tensor field $\phi$ as follows:

$$\phi = \begin{pmatrix} \frac{1}{2}\delta_{ij} & \frac{\sqrt{5}}{2}\delta_{ij} & 0 \\ \frac{\sqrt{5}}{2}\delta_{ij} & \frac{1}{2}\delta_{ij} & 0 \\ 0 & 0 & \psi\,\delta_{\alpha\beta} \end{pmatrix},$$

where $\psi$ is the Golden mean. Then $\phi$ is a Golden structure on $\mathbb{R}^{2n+m}_{n}$. Moreover, we define the conjugate tensor field $\phi^*$ as

$$\phi^* = \begin{pmatrix} \frac{1}{2}\delta_{ij} & -\frac{\sqrt{5}}{2}\delta_{ij} & 0 \\ -\frac{\sqrt{5}}{2}\delta_{ij} & \frac{1}{2}\delta_{ij} & 0 \\ 0 & 0 & \psi\,\delta_{\alpha\beta} \end{pmatrix}.$$

Then one can easily see that $(\mathbb{R}^{2n+m}_{n}, g, \phi)$ and $(\mathbb{R}^{2n+m}_{n}, g, \phi^*)$ are Golden-like statistical manifolds. This also verifies Proposition 1.

Let $(M = M_p(c_p) \times M_q(c_q), g, \phi)$ be a Golden product space form. Then the Riemannian curvature tensor $R$ of $M$ is given by [32]:

(2.7) $$R(X, Y)Z = \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}\{g(Y, Z)X - g(X, Z)Y + g(\phi Y, Z)\phi X - g(\phi X, Z)\phi Y\} + \frac{(1-\psi)c_p + \psi c_q}{4}\{g(\phi Y, Z)X - g(\phi X, Z)Y + g(Y, Z)\phi X - g(X, Z)\phi Y\},$$

where $M_p$ and $M_q$ are space forms of constant sectional curvatures $c_p$ and $c_q$, respectively. The curvature tensor $R^*$ with respect to the dual connection is obtained simply by replacing $\phi$ with $\phi^*$.

Let $M^n$ be a statistical submanifold of $(N^m, g, \phi)$. The Gauss and Weingarten formulae are

$$\tilde{\nabla}_X Y = \nabla_X Y + \sigma(X, Y), \qquad \tilde{\nabla}_X \xi = -A_\xi X + \nabla^{\perp}_X \xi,$$

$$\tilde{\nabla}^*_X Y = \nabla^*_X Y + \sigma^*(X, Y), \qquad \tilde{\nabla}^*_X \xi = -A^*_\xi X + \nabla^{*\perp}_X \xi$$

for all $X, Y \in TM$ and $\xi \in T^{\perp}M$, respectively. Moreover, we have the following equations:

$$X g(Y, Z) = g(\nabla_X Y, Z) + g(Y, \nabla^*_X Z),$$

$$g(\sigma(X, Y), \xi) = g(A^*_\xi X, Y), \qquad g(\sigma^*(X, Y), \xi) = g(A_\xi X, Y),$$

$$X g(\xi, \eta) = g(\nabla^{\perp}_X \xi, \eta) + g(\xi, \nabla^{*\perp}_X \eta).$$

The mean curvature vector fields with respect to an orthonormal tangent frame $\{e_1, e_2, \ldots, e_n\}$ and a normal frame $\{e_{n+1}, \ldots, e_m\}$ are defined, respectively, as

$$H = \frac{1}{n}\sum_{i=1}^{n}\sigma(e_i, e_i) = \frac{1}{n}\sum_{\gamma=n+1}^{m}\left(\sum_{i=1}^{n}\sigma^{\gamma}_{ii}\right)e_\gamma, \qquad \sigma^{\gamma}_{ij} = g(\sigma(e_i, e_j), e_\gamma)$$

and

$$H^* = \frac{1}{n}\sum_{i=1}^{n}\sigma^*(e_i, e_i) = \frac{1}{n}\sum_{\gamma=n+1}^{m}\left(\sum_{i=1}^{n}\sigma^{*\gamma}_{ii}\right)e_\gamma, \qquad \sigma^{*\gamma}_{ij} = g(\sigma^*(e_i, e_j), e_\gamma)$$

for $1 \le i, j \le n$ and $n+1 \le \gamma \le m$. Moreover, we have $2\sigma^0 = \sigma + \sigma^*$ and $2H^0 = H + H^*$, where the second fundamental form $\sigma^0$ and the mean curvature $H^0$ are computed with respect to the Levi-Civita connection $\nabla^0$ on $M$.

The squared mean curvatures are defined as

$$\|H\|^2 = \frac{1}{n^2}\sum_{\gamma=n+1}^{m}\left(\sum_{i=1}^{n}\sigma^{\gamma}_{ii}\right)^2, \qquad \|H^*\|^2 = \frac{1}{n^2}\sum_{\gamma=n+1}^{m}\left(\sum_{i=1}^{n}\sigma^{*\gamma}_{ii}\right)^2.$$

The Casorati curvatures are defined as

$$C = \frac{1}{n}\sum_{\gamma=n+1}^{m}\sum_{i,j=1}^{n}(\sigma^{\gamma}_{ij})^2, \qquad C^* = \frac{1}{n}\sum_{\gamma=n+1}^{m}\sum_{i,j=1}^{n}(\sigma^{*\gamma}_{ij})^2.$$

If we suppose that $W$ is a $d$-dimensional subspace of $TM$, $d \ge 2$, and $\{e_1, e_2, \ldots, e_d\}$ is an orthonormal basis of $W$, then the scalar curvature of the $d$-plane section $W$ is given by

$$\tau(W) = \sum_{1 \le u < v \le d} K(e_u \wedge e_v),$$

and the normalized scalar curvature ρ is defined as

$$\rho = \frac{2\tau}{n(n-1)}.$$

Also, the Casorati curvature of the subspace W is given by

$$C(W) = \frac{1}{d}\sum_{\gamma=n+1}^{m}\sum_{i,j=1}^{d}(\sigma^{\gamma}_{ij})^2, \qquad C^*(W) = \frac{1}{d}\sum_{\gamma=n+1}^{m}\sum_{i,j=1}^{d}(\sigma^{*\gamma}_{ij})^2.$$
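As a small numerical illustration of the two definitions above, the following sketch computes $C$ and $C(W)$ for made-up components $\sigma^{\gamma}_{ij}$ of the second fundamental form (an $n = 3$ submanifold with a single normal direction, chosen purely for illustration):

```python
import numpy as np

n = 3
# Hypothetical components sigma[gamma, i, j]; one normal direction here.
sigma = np.array([[[1.0, 0.2, 0.0],
                   [0.2, 2.0, 0.1],
                   [0.0, 0.1, 0.5]]])

# Casorati curvature: C = (1/n) * sum over gamma, i, j of (sigma_ij^gamma)^2.
C = (sigma**2).sum() / n

# Casorati curvature of the hyperplane W = span{e_1, ..., e_{n-1}}:
# the same sum restricted to 1 <= i, j <= d, divided by d = n - 1.
d = n - 1
C_W = (sigma[:, :d, :d]**2).sum() / d

print(C, C_W)
```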

A point $x \in M$ is called a quasi-umbilical point if at $x$ there exist $m - n$ mutually orthogonal unit normal vectors $e_i$, $i \in \{n+1, \ldots, m\}$, such that the shape operator with respect to each $e_i$ has an eigenvalue of multiplicity $n - 1$ and, for each $e_i$, the distinguished eigenvector is the same.

The normalized $\delta$-Casorati curvatures $\delta_c(n-1)$ and $\hat{\delta}_c(n-1)$ of the submanifold $M^n$ are, respectively, given by

$$[\delta_c(n-1)]_x = \frac{1}{2}C_x + \frac{n+1}{2n}\inf\{C(W) \mid W \text{ a hyperplane of } T_xM\}$$

and

$$[\hat{\delta}_c(n-1)]_x = 2C_x - \frac{2n-1}{2n}\sup\{C(W) \mid W \text{ a hyperplane of } T_xM\}.$$

In [5], Decu et al. generalized the notion of normalized $\delta$-Casorati curvature to the generalized normalized $\delta$-Casorati curvatures $\delta_C(k; n-1)$ and $\hat{\delta}_C(k; n-1)$. For a submanifold $M^n$ and any positive real number $k \ne n(n-1)$, the generalized normalized $\delta$-Casorati curvatures are given by

$$[\delta_C(k; n-1)]_x = kC_x + \frac{(n-1)(n+k)(n^2 - n - k)}{kn}\inf\{C(W) \mid W \text{ a hyperplane of } T_xM\},$$

if $0 < k < n^2 - n$, and

$$[\hat{\delta}_C(k; n-1)]_x = kC_x - \frac{(n-1)(n+k)(k - n^2 + n)}{kn}\sup\{C(W) \mid W \text{ a hyperplane of } T_xM\},$$

if $k > n^2 - n$.

The generalized normalized $\delta$-Casorati curvatures $\delta_C(k; n-1)$ and $\hat{\delta}_C(k; n-1)$ generalize the normalized $\delta$-Casorati curvatures $\delta_c(n-1)$ and $\hat{\delta}_c(n-1)$. In fact, we have the following relations (see [5]):

(2.8) $$\left[\delta_C\left(\frac{n(n-1)}{2}; n-1\right)\right]_x = n(n-1)[\delta_c(n-1)]_x,$$

(2.9) $$[\hat{\delta}_C(2n(n-1); n-1)]_x = n(n-1)[\hat{\delta}_c(n-1)]_x.$$

In the same way, the dual Casorati curvatures are obtained by replacing $\delta$ with $\delta^*$ and $C$ with $C^*$.

Now, we state the following fundamental results on statistical manifolds.

## Proposition 2

[20] Let $M$ be a statistical submanifold of $(\tilde{M}, g, \phi)$, and let $\tilde{R}$ and $\tilde{R}^*$ be the curvature tensors of $\tilde{M}$ with respect to $\tilde{\nabla}$ and $\tilde{\nabla}^*$, respectively. Then we have the following.

$$g(\tilde{R}(X, Y)Z, W) = g(R(X, Y)Z, W) + g(\sigma(X, Z), \sigma^*(Y, W)) - g(\sigma^*(X, W), \sigma(Y, Z)),$$

$$g(\tilde{R}^*(X, Y)Z, W) = g(R^*(X, Y)Z, W) + g(\sigma^*(X, Z), \sigma(Y, W)) - g(\sigma(X, W), \sigma^*(Y, Z)),$$

$$g(\tilde{R}(X, Y)\xi, \eta) = g(R^{\perp}(X, Y)\xi, \eta) + g([A_\xi, A^*_\eta]X, Y),$$

$$g(\tilde{R}^*(X, Y)\xi, \eta) = g(R^{*\perp}(X, Y)\xi, \eta) + g([A^*_\xi, A_\eta]X, Y),$$

where $[A_\xi, A^*_\eta] = A_\xi A^*_\eta - A^*_\eta A_\xi$ and $[A^*_\xi, A_\eta] = A^*_\xi A_\eta - A_\eta A^*_\xi$, for $X, Y, Z, W \in TM$ and $\xi, \eta \in T^{\perp}M$.

Now, we state two important lemmas which we use to prove the main results in the upcoming sections.

## Lemma 1

Let $n \ge 3$ be an integer and let $a_1, a_2, \ldots, a_n$ be $n$ real numbers. Then we have

$$\sum_{1 \le i < j \le n} a_i a_j - a_1 a_2 \le \frac{n-2}{2(n-1)}\left(\sum_{i=1}^{n} a_i\right)^2.$$

Moreover, the equality holds if and only if $a_1 + a_2 = a_3 = \cdots = a_n$.
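Lemma 1 is purely algebraic, so it can be sanity-checked numerically; the following sketch tests the inequality on random tuples and the stated equality case:

```python
import random

def lemma_gap(a):
    """rhs - lhs of Lemma 1; nonnegative iff the inequality holds for a."""
    n = len(a)
    lhs = sum(a[i] * a[j] for i in range(n) for j in range(i + 1, n)) - a[0] * a[1]
    rhs = (n - 2) / (2 * (n - 1)) * sum(a) ** 2
    return rhs - lhs

random.seed(0)
for _ in range(1000):
    a = [random.uniform(-10, 10) for _ in range(random.randint(3, 8))]
    assert lemma_gap(a) >= -1e-9  # inequality holds up to rounding

# Equality case: a_1 + a_2 = a_3 = ... = a_n  (here 1 + 2 = 3)
assert abs(lemma_gap([1.0, 2.0, 3.0, 3.0, 3.0])) < 1e-9
print("Lemma 1 verified on random samples")
```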

Optimization techniques play a pivotal role in proving inequalities involving Chen invariants. Oprea [35] applied the constrained extremum problem to prove Chen-Ricci inequalities for Lagrangian submanifolds of complex space forms. In the characterization of our main result, we will use the following lemma.

Let $M$ be a Riemannian submanifold of a Riemannian manifold $(\hat{M}, g)$ and let $y: \hat{M} \to \mathbb{R}$ be a differentiable function. Consider the constrained extremum problem

(2.10) $$\min_{x \in M} y(x).$$

Then we have the following lemma.

## Lemma 2

[35] If $x_0 \in M$ is a solution of the problem (2.10), then

1. $(\mathrm{grad}\, y)(x_0) \in T^{\perp}_{x_0}M$;

2. the bilinear form $\Omega: T_{x_0}M \times T_{x_0}M \to \mathbb{R}$ defined by

$$\Omega(X, Y) = \mathrm{Hess}\, y(X, Y) + g(\sigma(X, Y), (\mathrm{grad}\, y)(x_0))$$

is positive semi-definite, where $\sigma$ is the second fundamental form of $M$ in $\hat{M}$ and $\mathrm{grad}\, y$ is the gradient of $y$.

In principle, the bilinear form $\Omega$ is $\mathrm{Hess}(y|_M)(x_0)$. Therefore, if $\Omega$ is positive semi-definite on $M$, then the critical points of $y|_M$, which coincide with the points where $\mathrm{grad}\, y$ is normal to $M$, are global optimal solutions of the problem (2.10) (see, for instance, [36, Remark 3.2]).

## 3 Main inequalities

Let $\pi$ be a plane section spanned by an orthonormal basis $\{e_1, e_2\}$ and denote $\Psi(\pi) = g(\phi e_1, e_1)\,g(\phi e_2, e_2)$. Also, as in [37], for an orthonormal basis $\{e_1, e_2\}$ of the plane section we denote $\Theta(\pi) = g(\phi e_1, e_2)\,g(\phi e_1, e_2)$, which is a real number in $[0, 1]$.

## Theorem 1

Let $(N, g, \phi)$ be a Golden-like statistical manifold of dimension $m$ and let $M$ be a statistical submanifold of $N$ of dimension $n$. Then we have the following:

$$(\tau - K(\pi)) - (\tau^0 - K^0(\pi)) \ge \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[n(n-2) + \mathrm{tr}^2(\phi) - \mathrm{tr}(\phi)] + \frac{(1-\psi)c_p + \psi c_q}{4}\,2(n-1)\mathrm{tr}(\phi) - \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[1 + \Psi(\pi) + \Theta(\pi)] - \frac{n^2(n-2)}{4(n-1)}[\|H\|^2 + \|H^*\|^2] + 2\hat{K}^0(\pi) - 2\hat{\tau}^0.$$

## Proof

Let $\{e_1, e_2, \ldots, e_n\}$ and $\{e_{n+1}, \ldots, e_m\}$ be orthonormal frames of $TM$ and $T^{\perp}M$, respectively.

The scalar curvature corresponding to the sectional K -curvature is

$$\tau = \frac{1}{2}\sum_{1 \le i < j \le n}[g(R(e_i, e_j)e_j, e_i) + g(R^*(e_i, e_j)e_j, e_i) - 2g(R^0(e_i, e_j)e_j, e_i)].$$

Using (2.7) and the Gauss equations for $R$ and $R^*$, after some simple calculations we obtain

$$\tau = \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[n(n-1) + \mathrm{tr}^2(\phi) - \mathrm{tr}(\phi^2)] + \frac{(1-\psi)c_p + \psi c_q}{4}\,2(n-1)\mathrm{tr}(\phi) - \tau^0 + \frac{1}{2}\sum_{\gamma=n+1}^{m}\sum_{1 \le i < j \le n}[\sigma^{\gamma}_{ii}\sigma^{*\gamma}_{jj} + \sigma^{*\gamma}_{ii}\sigma^{\gamma}_{jj} - 2\sigma^{\gamma}_{ij}\sigma^{*\gamma}_{ij}].$$

In view of (2.5), we obtain

$$\tau = \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[n(n-2) + \mathrm{tr}^2(\phi) - \mathrm{tr}(\phi)] + \frac{(1-\psi)c_p + \psi c_q}{4}\,2(n-1)\mathrm{tr}(\phi) - \tau^0 + \frac{1}{2}\sum_{\gamma=n+1}^{m}\sum_{1 \le i < j \le n}[\sigma^{\gamma}_{ii}\sigma^{*\gamma}_{jj} + \sigma^{*\gamma}_{ii}\sigma^{\gamma}_{jj} - 2\sigma^{\gamma}_{ij}\sigma^{*\gamma}_{ij}],$$

which can be written as

$$\tau = \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[n(n-2) + \mathrm{tr}^2(\phi) - \mathrm{tr}(\phi)] + \frac{(1-\psi)c_p + \psi c_q}{4}\,2(n-1)\mathrm{tr}(\phi) - \tau^0 + 2\sum_{\gamma=n+1}^{m}\sum_{1 \le i < j \le n}[\sigma^{0\gamma}_{ii}\sigma^{0\gamma}_{jj} - (\sigma^{0\gamma}_{ij})^2] - \frac{1}{2}\sum_{\gamma=n+1}^{m}\sum_{1 \le i < j \le n}\left[\{\sigma^{\gamma}_{ii}\sigma^{\gamma}_{jj} - (\sigma^{\gamma}_{ij})^2\} + \{\sigma^{*\gamma}_{ii}\sigma^{*\gamma}_{jj} - (\sigma^{*\gamma}_{ij})^2\}\right].$$

By using the Gauss equation for the Levi-Civita connection, we have

(3.1) $$\tau = \tau^0 + \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[n(n-2) + \mathrm{tr}^2(\phi) - \mathrm{tr}(\phi)] + \frac{(1-\psi)c_p + \psi c_q}{4}\,2(n-1)\mathrm{tr}(\phi) - 2\hat{\tau}^0 - \frac{1}{2}\sum_{\gamma=n+1}^{m}\sum_{1 \le i < j \le n}\left[\{\sigma^{\gamma}_{ii}\sigma^{\gamma}_{jj} - (\sigma^{\gamma}_{ij})^2\} + \{\sigma^{*\gamma}_{ii}\sigma^{*\gamma}_{jj} - (\sigma^{*\gamma}_{ij})^2\}\right].$$

Now, the sectional K -curvature K ( π ) of the plane section π is

(3.2) $$K(\pi) = \frac{1}{2}[g(R(e_1, e_2)e_2, e_1) + g(R^*(e_1, e_2)e_2, e_1) - 2g(R^0(e_1, e_2)e_2, e_1)].$$

Using (2.7) and the Gauss equations for $R$ and $R^*$, and substituting into (3.2), we obtain

$$K(\pi) = \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[1 + g(\phi e_1, e_1)g(\phi e_2, e_2) + g(\phi e_1, e_2)g(\phi e_2, e_1)] - K^0(\pi) + \frac{1}{2}\sum_{\gamma=n+1}^{m}[\sigma^{\gamma}_{11}\sigma^{*\gamma}_{22} + \sigma^{*\gamma}_{11}\sigma^{\gamma}_{22} - 2\sigma^{\gamma}_{12}\sigma^{*\gamma}_{12}].$$

Using $\sigma + \sigma^* = 2\sigma^0$, we obtain

$$K(\pi) = \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[1 + g(\phi e_1, e_1)g(\phi e_2, e_2) + g(\phi e_1, e_2)g(\phi e_2, e_1)] - K^0(\pi) + 2\sum_{\gamma=n+1}^{m}[\sigma^{0\gamma}_{11}\sigma^{0\gamma}_{22} - (\sigma^{0\gamma}_{12})^2] - \frac{1}{2}\sum_{\gamma=n+1}^{m}\left[\{\sigma^{\gamma}_{11}\sigma^{\gamma}_{22} - (\sigma^{\gamma}_{12})^2\} + \{\sigma^{*\gamma}_{11}\sigma^{*\gamma}_{22} - (\sigma^{*\gamma}_{12})^2\}\right].$$

Using Gauss equation with respect to Levi-Civita connection, we have

$$K(\pi) = K^0(\pi) + \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[1 + g(\phi e_1, e_1)g(\phi e_2, e_2) + g(\phi e_1, e_2)g(\phi e_2, e_1)] - 2\hat{K}^0(\pi) - \frac{1}{2}\sum_{\gamma=n+1}^{m}[\sigma^{\gamma}_{11}\sigma^{\gamma}_{22} - (\sigma^{\gamma}_{12})^2] - \frac{1}{2}\sum_{\gamma=n+1}^{m}[\sigma^{*\gamma}_{11}\sigma^{*\gamma}_{22} - (\sigma^{*\gamma}_{12})^2].$$

The above equation can be written in the form

(3.3) $$K(\pi) = K^0(\pi) + \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[1 + \Psi(\pi) + \Theta(\pi)] - 2\hat{K}^0(\pi) - \frac{1}{2}\sum_{\gamma=n+1}^{m}[\sigma^{\gamma}_{11}\sigma^{\gamma}_{22} - (\sigma^{\gamma}_{12})^2] - \frac{1}{2}\sum_{\gamma=n+1}^{m}[\sigma^{*\gamma}_{11}\sigma^{*\gamma}_{22} - (\sigma^{*\gamma}_{12})^2].$$

From (3.1) and (3.3), we have

$$(\tau - K(\pi)) - (\tau^0 - K^0(\pi)) = \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[n(n-2) + \mathrm{tr}^2(\phi) - \mathrm{tr}(\phi)] + \frac{(1-\psi)c_p + \psi c_q}{4}\,2(n-1)\mathrm{tr}(\phi) - \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[1 + \Psi(\pi) + \Theta(\pi)] - \frac{1}{2}\sum_{\gamma=n+1}^{m}\sum_{1 \le i < j \le n}[\sigma^{\gamma}_{ii}\sigma^{\gamma}_{jj} - (\sigma^{\gamma}_{ij})^2] - \frac{1}{2}\sum_{\gamma=n+1}^{m}\sum_{1 \le i < j \le n}[\sigma^{*\gamma}_{ii}\sigma^{*\gamma}_{jj} - (\sigma^{*\gamma}_{ij})^2] + \frac{1}{2}\sum_{\gamma=n+1}^{m}\left[\{\sigma^{\gamma}_{11}\sigma^{\gamma}_{22} - (\sigma^{\gamma}_{12})^2\} + \{\sigma^{*\gamma}_{11}\sigma^{*\gamma}_{22} - (\sigma^{*\gamma}_{12})^2\}\right] + 2\hat{K}^0(\pi) - 2\hat{\tau}^0.$$

Applying Lemma 1 to the diagonal terms and discarding the nonnegative off-diagonal terms, we can write the above equation in the simplified form

$$(\tau - K(\pi)) - (\tau^0 - K^0(\pi)) \ge \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[n(n-2) + \mathrm{tr}^2(\phi) - \mathrm{tr}(\phi)] + \frac{(1-\psi)c_p + \psi c_q}{4}\,2(n-1)\mathrm{tr}(\phi) - \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[1 + \Psi(\pi) + \Theta(\pi)] - \frac{n^2(n-2)}{4(n-1)}[\|H\|^2 + \|H^*\|^2] + 2\hat{K}^0(\pi) - 2\hat{\tau}^0.$$

This proves our claims.□

## Corollary 1

Let $(N, g, \phi)$ be a Golden-like statistical manifold of dimension $m$ and let $M$ be a totally real statistical submanifold of $N$ of dimension $n$. Then we have the following:

$$(\tau - K(\pi)) - (\tau^0 - K^0(\pi)) \ge \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[n(n-2) - 1] - \frac{n^2(n-2)}{4(n-1)}[\|H\|^2 + \|H^*\|^2] + 2\hat{K}^0(\pi) - 2\hat{\tau}^0.$$

## Theorem 2

Let $M^n$ be a statistical submanifold of a Golden-like statistical manifold $N^m$. Then, for the generalized normalized $\delta$-Casorati curvatures, we have the following optimal relationships:

1. For any real number $k$ such that $0 < k < n(n-1)$,

(3.4) $$\rho \le \frac{\delta^0_C(k; n-1)}{n(n-1)} + \frac{1}{n-1}C^0 + \frac{n}{n-1}g(H, H^*) - \frac{2n}{n-1}\|H^0\|^2 + \frac{1}{n(n-1)}\,\frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[n(n-2) + \mathrm{tr}^2(\phi) - \mathrm{tr}(\phi)] + \frac{2}{n}\,\frac{(1-\psi)c_p + \psi c_q}{4}\mathrm{tr}(\phi),$$

where $\delta^0_C(k; n-1) = \frac{1}{2}[\delta_C(k; n-1) + \delta^*_C(k; n-1)]$.

2. For any real number $k > n(n-1)$,

(3.5) $$\rho \le \frac{\hat{\delta}^0_C(k; n-1)}{n(n-1)} + \frac{1}{n-1}C^0 + \frac{n}{n-1}g(H, H^*) - \frac{2n}{n-1}\|H^0\|^2 + \frac{1}{n(n-1)}\,\frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[n(n-2) + \mathrm{tr}^2(\phi) - \mathrm{tr}(\phi)] + \frac{2}{n}\,\frac{(1-\psi)c_p + \psi c_q}{4}\mathrm{tr}(\phi),$$

where $\hat{\delta}^0_C(k; n-1) = \frac{1}{2}[\hat{\delta}_C(k; n-1) + \hat{\delta}^*_C(k; n-1)]$.

## Proof

Let $p \in M$ and let $\{e_1, \ldots, e_n\}$ and $\{e_{n+1}, \ldots, e_m\}$ be orthonormal bases of $T_pM$ and $T^{\perp}_pM$, respectively. From the Gauss equation, we obtain

$$2\tau = n^2 g(H, H^*) - \sum_{1 \le i, j \le n} g(\sigma(e_i, e_j), \sigma^*(e_i, e_j)) + \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[n(n-2) + \mathrm{tr}^2(\phi) - \mathrm{tr}(\phi)] + \frac{(1-\psi)c_p + \psi c_q}{4}\,2(n-1)\mathrm{tr}(\phi).$$

Denote $H + H^* = 2H^0$ and $C + C^* = 2C^0$. Then the above equation becomes

$$2\tau = \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[n(n-2) + \mathrm{tr}^2(\phi) - \mathrm{tr}(\phi)] + \frac{(1-\psi)c_p + \psi c_q}{4}\,2(n-1)\mathrm{tr}(\phi) + 2n^2\|H^0\|^2 - \frac{n^2}{2}(\|H\|^2 + \|H^*\|^2) - 2nC^0 + \frac{n}{2}(C + C^*).$$

We define a polynomial $\mathcal{P}$ in the components of the second fundamental forms as

(3.6) $$\mathcal{P} = kC^0 + a(k)C^0(W) + \frac{n}{2}(C + C^*) - \frac{n^2}{2}(\|H\|^2 + \|H^*\|^2) - 2\tau(p) + \frac{(1-\psi)c_p - \psi c_q}{2\sqrt{5}}[n(n-2) + \mathrm{tr}^2(\phi) - \mathrm{tr}(\phi)] + \frac{(1-\psi)c_p + \psi c_q}{4}\,2(n-1)\mathrm{tr}(\phi),$$

where $W$ is a hyperplane of $T_pM$. Assuming that $W$ is spanned by $\{e_1, \ldots, e_{n-1}\}$, we have

(3.7) $$\mathcal{P} = \sum_{\alpha=n+1}^{m}\left[\frac{2n+k}{n}\sum_{i,j=1}^{n}(\sigma^{0\alpha}_{ij})^2 + \frac{a(k)}{n-1}\sum_{i,j=1}^{n-1}(\sigma^{0\alpha}_{ij})^2 - 2\left(\sum_{i=1}^{n}\sigma^{0\alpha}_{ii}\right)^2\right].$$

Equation (3.7) yields

$$\mathcal{P} = \sum_{\alpha=n+1}^{m}\left[2\left(\frac{2n+k}{n} + \frac{a(k)}{n-1}\right)\sum_{1 \le i < j \le n-1}(\sigma^{0\alpha}_{ij})^2 + \frac{2(2n+k)}{n}\sum_{i=1}^{n-1}(\sigma^{0\alpha}_{in})^2 + \left(\frac{2n+k}{n} + \frac{a(k)}{n-1} - 2\right)\sum_{i=1}^{n-1}(\sigma^{0\alpha}_{ii})^2 - 4\sum_{1 \le i < j \le n}\sigma^{0\alpha}_{ii}\sigma^{0\alpha}_{jj} + \left(\frac{2n+k}{n} - 2\right)(\sigma^{0\alpha}_{nn})^2\right] \ge \sum_{\alpha=n+1}^{m}\left[\frac{k(n-1) + na(k)}{n(n-1)}\sum_{i=1}^{n-1}(\sigma^{0\alpha}_{ii})^2 + \frac{k}{n}(\sigma^{0\alpha}_{nn})^2 - 4\sum_{1 \le i < j \le n}\sigma^{0\alpha}_{ii}\sigma^{0\alpha}_{jj}\right].$$

For any $\alpha \in \{n+1, \ldots, m\}$, let $y_\alpha: \mathbb{R}^n \to \mathbb{R}$ be the quadratic form defined by

$$y_\alpha(\sigma^{0\alpha}_{11}, \sigma^{0\alpha}_{22}, \ldots, \sigma^{0\alpha}_{nn}) = \frac{k(n-1) + na(k)}{n(n-1)}\sum_{i=1}^{n-1}(\sigma^{0\alpha}_{ii})^2 + \frac{k}{n}(\sigma^{0\alpha}_{nn})^2 - 4\sum_{1 \le i < j \le n}\sigma^{0\alpha}_{ii}\sigma^{0\alpha}_{jj},$$

and consider the constrained optimization problem

$$\min y_\alpha$$

subject to $G: \sigma^{0\alpha}_{11} + \sigma^{0\alpha}_{22} + \cdots + \sigma^{0\alpha}_{nn} = p^\alpha$, where $p^\alpha$ is a real constant.

The partial derivatives of y α are

(3.8) $$\frac{\partial y_\alpha}{\partial \sigma^{0\alpha}_{ii}} = 2\,\frac{k(n-1) + na(k)}{n(n-1)}\,\sigma^{0\alpha}_{ii} - 4\left(\sum_{l=1}^{n}\sigma^{0\alpha}_{ll} - \sigma^{0\alpha}_{ii}\right) = 0, \qquad \frac{\partial y_\alpha}{\partial \sigma^{0\alpha}_{nn}} = \frac{2k}{n}\,\sigma^{0\alpha}_{nn} - 4\sum_{l=1}^{n-1}\sigma^{0\alpha}_{ll} = 0,$$

for every $i \in \{1, \ldots, n-1\}$ and $\alpha \in \{n+1, \ldots, m\}$.

For an optimal solution $(\sigma^{0\alpha}_{11}, \sigma^{0\alpha}_{22}, \ldots, \sigma^{0\alpha}_{nn})$ of the problem, the vector $\mathrm{grad}\, y_\alpha$ is normal to $G$; it is collinear with the vector $(1, \ldots, 1)$. From (3.8) and $\sum_{i=1}^{n}\sigma^{0\alpha}_{ii} = p^\alpha$, it follows that a critical point of the corresponding problem has the form

$$\sigma^{0\alpha}_{ii} = \frac{2n(n-1)}{(n-1)(2n+k) + na(k)}\,p^\alpha, \qquad \sigma^{0\alpha}_{nn} = \frac{2n}{2n+k}\,p^\alpha,$$

for any $i \in \{1, \ldots, n-1\}$ and $\alpha \in \{n+1, \ldots, m\}$.

For $p \in G$, let $\mathcal{A}: T_pG \times T_pG \to \mathbb{R}$ be the bilinear form defined by

$$\mathcal{A}(X, Y) = \mathrm{Hess}(y_\alpha)(X, Y) + \langle \bar{\sigma}(X, Y), (\mathrm{grad}\, y_\alpha)(p) \rangle,$$

where $\bar{\sigma}$ is the second fundamental form of $G$ in $\mathbb{R}^n$ and $\langle\,,\rangle$ is the standard inner product on $\mathbb{R}^n$.

Moreover, it is easy to see that the Hessian matrix of y α has the form

$$\mathrm{Hess}(y_\alpha) = \begin{pmatrix} \frac{2[(n-1)(k+2n) + na(k)]}{n(n-1)} & -4 & \cdots & -4 & -4 \\ -4 & \frac{2[(n-1)(k+2n) + na(k)]}{n(n-1)} & \cdots & -4 & -4 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ -4 & -4 & \cdots & \frac{2[(n-1)(k+2n) + na(k)]}{n(n-1)} & -4 \\ -4 & -4 & \cdots & -4 & \frac{2k}{n} \end{pmatrix}.$$

As $G$ is a totally geodesic hyperplane in $\mathbb{R}^n$, consider a vector $X = (X_1, \ldots, X_n)$ tangent to $G$ at an arbitrary point $x$ of $G$, that is, satisfying the relation $\sum_{i=1}^{n} X_i = 0$. Then we obtain

A ( X , X ) = 2 ( n 1 ) ( k + 2 n ) + n a ( k