
Higher-dimension Tensor Completion via Low-rank Tensor Ring Decomposition

Longhao Yuan∗†, Jianting Cao§∗, Xuyang Zhao∗†, Qiang Wu¶‖ and Qibin Zhao†‡

∗ Graduate School of Engineering, Saitama Institute of Technology, Japan
† Tensor Learning Unit, RIKEN Center for Advanced Intelligence Project (AIP), Japan
‡ School of Automation, Guangdong University of Technology, China
§ School of Computer Science and Technology, Hangzhou Dianzi University, China
¶ School of Information Science and Engineering, Shandong University, China
‖ Institute of Brain and Brain-Inspired Science, Shandong University, China

Abstract—The problem of incomplete data is common in signal processing and machine learning. Tensor completion algorithms aim to recover incomplete data from its partially observed entries. In this paper, taking advantage of the high compressibility and flexibility of the recently proposed tensor ring (TR) decomposition, we propose a new tensor completion approach named tensor ring weighted optimization (TR-WOPT). It finds the latent factors of the incomplete tensor by a gradient descent algorithm; the latent factors are then employed to predict the missing entries of the tensor. We conduct various tensor completion experiments on synthetic data and real-world data. The simulation results show that TR-WOPT performs well on various high-dimension tensors. Furthermore, image completion results show that our proposed algorithm outperforms the state-of-the-art algorithms in many situations. Especially when the missing rate of the test images is high (e.g., over 0.9), the performance of TR-WOPT is significantly better than that of the compared algorithms.

I. INTRODUCTION

Tensors are high-dimension representations of vectors and matrices. Many kinds of real-world data, for example, color images (length × width × RGB channels), videos (length × width × RGB channels × time) and electroencephalography (EEG) signals (magnitude × trials × time), have more than two dimensions. Usually, such high-dimension data is first transformed to a vector or matrix before it can be applied to traditional algorithms. In that way, the adjacent structure information of the original data is lost, leading to redundant space cost and low-efficiency computation [20]. Tensor representation retains the high dimension of the data and avoids these problems. Tensors have been studied for more than a century, and many methodologies have been proposed [14]. Moreover, tensors have been applied in various research fields such as signal processing [6], machine learning [2], data completion [1], and brain-computer interfaces (BCI) [16].

CANDECOMP/PARAFAC (CP) decomposition [3] and Tucker decomposition [22] are the most popular and classic tensor decomposition models, for which many theories and applications have been proposed [21]. In recent years, a new theory system named tensor networks has drawn attention and become a promising branch of tensor methodology [5], [7]. One of the most representative tensor decomposition models among tensor networks is the matrix product state (MPS), also known as tensor train (TT) decomposition [19]. TT decomposition shows high data compression ability and computational efficiency. One of its most significant features is that it overcomes the "curse of dimensionality": for an N-dimension tensor, the number of parameters of Tucker decomposition is exponential in N, while the number of parameters of TT decomposition is linear in N. Although CP decomposition also achieves high compression ability with a number of parameters linear in N, finding the optimal latent factors of CP decomposition is very difficult. Recently, a new tensor decomposition model named tensor ring (TR) decomposition, which is more general than TT decomposition, has been proposed [27]. Tensor ring owns all the dominant properties of TT while imposing a more flexible model constraint: TR relaxes the rank constraint of TT, which leads to further interesting properties, e.g., enhanced compression ability, interpretability of latent factors, and rotational invariance of latent factors [25]. Several papers have studied and applied TR decomposition in machine learning [23], [4], in which TR shows promise in representation ability and computational efficiency.

Corresponding authors: Jianting Cao ([email protected]) and Qibin Zhao ([email protected]).
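The parameter-count comparison above can be made concrete with a quick back-of-the-envelope calculation. The sizes `N`, `I` and `R` below are illustrative values chosen for the sketch, not figures from the paper, and a single uniform rank `R` is assumed for all models:

```python
N, I, R = 8, 10, 4  # number of modes, mode size, rank (illustrative values)

cp_params = N * I * R                         # CP: N factor matrices of size I x R
tucker_params = N * I * R + R ** N            # Tucker: N factors plus an R^N core
tt_params = 2 * I * R + (N - 2) * I * R * R   # TT: two boundary cores, N-2 interior cores

print(cp_params, tucker_params, tt_params)    # Tucker's core term blows up as N grows
```

Here CP and TT both scale linearly in N, while the Tucker core alone contributes R^N parameters.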

Tensor completion aims to recover an incomplete tensor from its partially observed entries, and it has been applied in various completion problems such as image/video completion [15], [28], compressed sensing [10], link prediction [18], and recommendation systems [13]. The theoretical key point of matrix completion and tensor completion is the low-rank assumption on the data. There exist strong theoretical support and various solutions for the low-rank problem of matrices, and the most studied convex relaxation of a low-rank matrix is the nuclear norm [12]. However, determining the rank of a tensor is an NP-hard problem [11]. To address this, there are mainly two types of tensor completion methods: the "rank minimization based" approach and the "tensor decomposition based" approach [17]. The first approach formulates convex surrogate models of low-rank tensors.

Proceedings, APSIPA Annual Summit and Conference 2018, 12-15 November 2018, Hawaii. 978-988-14768-5-2 ©2018 APSIPA APSIPA-ASC 2018

The most representative model of this kind of tensor completion approach employs low-rank constraints on the matricization of every mode of the incomplete tensor [15]. For an N-dimension incomplete tensor T, the low-rank tensor completion model is formulated as:

\[
\min_{\mathcal{X}} \; \sum_{n=1}^{N} \left\| \mathbf{X}_{(n)} \right\|_{*}, \quad \text{s.t.} \quad P_{\Omega}(\mathcal{X}) = P_{\Omega}(\mathcal{T}), \tag{1}
\]

where X is the low-rank approximation tensor, ‖ · ‖∗ is the nuclear norm, and PΩ(T) denotes the entries w.r.t. the set of indices of observed entries represented by Ω. The missing entries of T are approximated by X, and the rank of the completed tensor X is determined automatically. Different from the "rank minimization based" approach, the "tensor decomposition based" approach does not find the low-rank tensor directly; instead, it first finds a tensor decomposition of the incomplete data from the observed entries, and then the latent factors are used to predict the missing entries. This kind of approach sets the rank of the tensor decomposition manually, and the optimization model is given below:
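The objective of (1) can be sketched in NumPy: compute every mode-n matricization and sum the nuclear norms. The helper name `unfold` and the toy tensor values are illustrative choices, not from the paper:

```python
import numpy as np

def unfold(X, mode):
    """Mode-n matricization: move mode n to the front and flatten the remaining modes."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

T = np.arange(24, dtype=float).reshape(2, 3, 4)  # toy 3rd-order tensor

# Sum of nuclear norms of all mode unfoldings, as in the objective of eq. (1)
cost = sum(np.linalg.norm(unfold(T, n), "nuc") for n in range(T.ndim))
print(cost)
```

In a full completion algorithm this quantity would be minimized subject to the observed-entry constraint, typically via singular value thresholding.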

\[
\min_{\{\mathcal{G}^{(n)}\}_{n=1}^{N}} \; \left\| \mathcal{W} * \left( \mathcal{T} - \mathcal{X}\!\left(\{\mathcal{G}^{(n)}\}_{n=1}^{N}\right) \right) \right\|_{F}^{2}, \tag{2}
\]

where $\|\cdot\|_F$ is the Frobenius norm, $\{\mathcal{G}^{(n)}\}_{n=1}^{N}$ is the sequence of latent factors under consideration, and $\mathcal{X}(\{\mathcal{G}^{(n)}\}_{n=1}^{N})$ is the tensor approximated by the latent factors. $\mathcal{W}$ is a weight tensor of the same size as the incomplete tensor, and every entry of $\mathcal{W}$ satisfies:

\[
w_{i_1 i_2 \cdots i_N} =
\begin{cases}
0 & \text{if } t_{i_1 i_2 \cdots i_N} \text{ is a missing entry},\\
1 & \text{if } t_{i_1 i_2 \cdots i_N} \text{ is an observed entry}.
\end{cases} \tag{3}
\]

More explanations of notations can be found in Section II.A. Based on different tensor decomposition models, various "tensor decomposition based" approaches have been proposed, e.g., CP weighted optimization [1], weighted Tucker [9] and TT weighted optimization [24]. All these methods aim to find the specific structure of the incomplete data via different kinds of tensor decompositions. However, the CP, Tucker and TT based WOPT algorithms apply tensor decomposition models which lack flexibility, and this may lead to poor convergence on different kinds of data. The TR decomposition model is much more flexible, so a TR-based WOPT method can obtain a better approximation of the incomplete data and thus provide better completion results.
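A minimal sketch of the weighted objective in (2) with the binary mask of (3), assuming a random observation pattern; all names, sizes and the observation rate are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((4, 5, 6))              # incomplete tensor's underlying data
W = (rng.random(T.shape) < 0.7).astype(float)   # eq. (3): 1 = observed, 0 = missing

X = np.zeros_like(T)  # stand-in for the approximation built from the latent factors

# eq. (2): squared Frobenius norm of the entry-wise weighted residual
loss = np.sum((W * (T - X)) ** 2)
```

Because the mask zeroes out unobserved entries, only the observed part of the residual contributes to the loss, which is what lets the latent factors be fit from partial data.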

In this paper, we propose a novel tensor completion method which shows good performance in various experimental situations. The main works of this paper are listed below:

• Based on the recently proposed TR decomposition, we propose a new tensor completion algorithm named tensor ring weighted optimization (TR-WOPT).

• The TR latent factors are optimized by a gradient descent method and then used to predict the missing entries of the incomplete tensor.

• We conduct several simulation experiments and real-world data experiments. The experiment results show that our method outperforms state-of-the-art tensor completion algorithms in various situations. We also find
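The prediction step above relies on rebuilding the full tensor from the TR cores. A minimal sketch of that circular contraction, based on the standard element-wise TR definition T(i1, …, iN) = Tr(G^(1)(i1) · · · G^(N)(iN)); the function name and core sizes are illustrative, not the paper's implementation:

```python
import numpy as np

def tr_to_full(cores):
    """Contract TR cores G_k of shape (r_k, I_k, r_{k+1}), with r_{N+1} = r_1,
    into the full tensor; the final trace closes the ring."""
    out = cores[0]
    for G in cores[1:]:
        out = np.tensordot(out, G, axes=([-1], [0]))  # chain cores along rank indices
    return np.trace(out, axis1=0, axis2=-1)           # close the ring

rng = np.random.default_rng(0)
ranks, dims = [2, 3, 4], [5, 6, 7]  # rank pattern wraps around: r_4 = r_1 = 2
cores = [rng.standard_normal((ranks[k], dims[k], ranks[(k + 1) % 3]))
         for k in range(3)]
X = tr_to_full(cores)
print(X.shape)  # (5, 6, 7)
```

In a gradient-based scheme such as TR-WOPT, each core would be updated by differentiating the weighted loss of eq. (2) through this reconstruction.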


IN<latexit sha1_base64="QVYs27g/9a4Jbga5Q6E5OIMBGPE=">AAAB6XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lF0N4KXvQiFY0ttKFstpN26WYTdjdCCf0JXjyoePUfefPfuG1z0NYHA4/3ZpiZFySCa+O6305hZXVtfaO4Wdra3tndK+8fPOo4VQw9FotYtQOqUXCJnuFGYDtRSKNAYCsYXU391hMqzWP5YMYJ+hEdSB5yRo2V7m96t71yxa26M5BlUstJBXI0e+Wvbj9maYTSMEG17tTcxPgZVYYzgZNSN9WYUDaiA+xYKmmE2s9mp07IiVX6JIyVLWnITP09kdFI63EU2M6ImqFe9Kbif14nNeGln3GZpAYlmy8KU0FMTKZ/kz5XyIwYW0KZ4vZWwoZUUWZsOiUbQm3x5WXinVXrVffuvNKo52kU4QiO4RRqcAENuIYmeMBgAM/wCm+OcF6cd+dj3lpw8plD+APn8wda2o1Y</latexit><latexit sha1_base64="QVYs27g/9a4Jbga5Q6E5OIMBGPE=">AAAB6XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lF0N4KXvQiFY0ttKFstpN26WYTdjdCCf0JXjyoePUfefPfuG1z0NYHA4/3ZpiZFySCa+O6305hZXVtfaO4Wdra3tndK+8fPOo4VQw9FotYtQOqUXCJnuFGYDtRSKNAYCsYXU391hMqzWP5YMYJ+hEdSB5yRo2V7m96t71yxa26M5BlUstJBXI0e+Wvbj9maYTSMEG17tTcxPgZVYYzgZNSN9WYUDaiA+xYKmmE2s9mp07IiVX6JIyVLWnITP09kdFI63EU2M6ImqFe9Kbif14nNeGln3GZpAYlmy8KU0FMTKZ/kz5XyIwYW0KZ4vZWwoZUUWZsOiUbQm3x5WXinVXrVffuvNKo52kU4QiO4RRqcAENuIYmeMBgAM/wCm+OcF6cd+dj3lpw8plD+APn8wda2o1Y</latexit><latexit sha1_base64="QVYs27g/9a4Jbga5Q6E5OIMBGPE=">AAAB6XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lF0N4KXvQiFY0ttKFstpN26WYTdjdCCf0JXjyoePUfefPfuG1z0NYHA4/3ZpiZFySCa+O6305hZXVtfaO4Wdra3tndK+8fPOo4VQw9FotYtQOqUXCJnuFGYDtRSKNAYCsYXU391hMqzWP5YMYJ+hEdSB5yRo2V7m96t71yxa26M5BlUstJBXI0e+Wvbj9maYTSMEG17tTcxPgZVYYzgZNSN9WYUDaiA+xYKmmE2s9mp07IiVX6JIyVLWnITP09kdFI63EU2M6ImqFe9Kbif14nNeGln3GZpAYlmy8KU0FMTKZ/kz5XyIwYW0KZ4vZWwoZUUWZsOiUbQm3x5WXinVXrVffuvNKo52kU4QiO4RRqcAENuIYmeMBgAM/wCm+OcF6cd+dj3lpw8plD+APn8wda2o1Y</latexit>

In<latexit sha1_base64="Vz3oijCeKvUFwa+Eml0ScRB70i8=">AAAB6XicbVBNS8NAEJ34WetX1aOXxSJ4KokI2lvBi94qGltoQ9lsJ+3SzSbsboQS+hO8eFDx6j/y5r9x2+agrQ8GHu/NMDMvTAXXxnW/nZXVtfWNzdJWeXtnd2+/cnD4qJNMMfRZIhLVDqlGwSX6hhuB7VQhjUOBrXB0PfVbT6g0T+SDGacYxHQgecQZNVa6v+3JXqXq1twZyDLxClKFAs1e5avbT1gWozRMUK07npuaIKfKcCZwUu5mGlPKRnSAHUsljVEH+ezUCTm1Sp9EibIlDZmpvydyGms9jkPbGVMz1IveVPzP62QmugpyLtPMoGTzRVEmiEnI9G/S5wqZEWNLKFPc3krYkCrKjE2nbEPwFl9eJv55rV5z7y6qjXqRRgmO4QTOwINLaMANNMEHBgN4hld4c4Tz4rw7H/PWFaeYOYI/cD5/AIs6jXg=</latexit><latexit sha1_base64="Vz3oijCeKvUFwa+Eml0ScRB70i8=">AAAB6XicbVBNS8NAEJ34WetX1aOXxSJ4KokI2lvBi94qGltoQ9lsJ+3SzSbsboQS+hO8eFDx6j/y5r9x2+agrQ8GHu/NMDMvTAXXxnW/nZXVtfWNzdJWeXtnd2+/cnD4qJNMMfRZIhLVDqlGwSX6hhuB7VQhjUOBrXB0PfVbT6g0T+SDGacYxHQgecQZNVa6v+3JXqXq1twZyDLxClKFAs1e5avbT1gWozRMUK07npuaIKfKcCZwUu5mGlPKRnSAHUsljVEH+ezUCTm1Sp9EibIlDZmpvydyGms9jkPbGVMz1IveVPzP62QmugpyLtPMoGTzRVEmiEnI9G/S5wqZEWNLKFPc3krYkCrKjE2nbEPwFl9eJv55rV5z7y6qjXqRRgmO4QTOwINLaMANNMEHBgN4hld4c4Tz4rw7H/PWFaeYOYI/cD5/AIs6jXg=</latexit><latexit sha1_base64="Vz3oijCeKvUFwa+Eml0ScRB70i8=">AAAB6XicbVBNS8NAEJ34WetX1aOXxSJ4KokI2lvBi94qGltoQ9lsJ+3SzSbsboQS+hO8eFDx6j/y5r9x2+agrQ8GHu/NMDMvTAXXxnW/nZXVtfWNzdJWeXtnd2+/cnD4qJNMMfRZIhLVDqlGwSX6hhuB7VQhjUOBrXB0PfVbT6g0T+SDGacYxHQgecQZNVa6v+3JXqXq1twZyDLxClKFAs1e5avbT1gWozRMUK07npuaIKfKcCZwUu5mGlPKRnSAHUsljVEH+ezUCTm1Sp9EibIlDZmpvydyGms9jkPbGVMz1IveVPzP62QmugpyLtPMoGTzRVEmiEnI9G/S5wqZEWNLKFPc3krYkCrKjE2nbEPwFl9eJv55rV5z7y6qjXqRRgmO4QTOwINLaMANNMEHBgN4hld4c4Tz4rw7H/PWFaeYOYI/cD5/AIs6jXg=</latexit>

In<latexit sha1_base64="Vz3oijCeKvUFwa+Eml0ScRB70i8=">AAAB6XicbVBNS8NAEJ34WetX1aOXxSJ4KokI2lvBi94qGltoQ9lsJ+3SzSbsboQS+hO8eFDx6j/y5r9x2+agrQ8GHu/NMDMvTAXXxnW/nZXVtfWNzdJWeXtnd2+/cnD4qJNMMfRZIhLVDqlGwSX6hhuB7VQhjUOBrXB0PfVbT6g0T+SDGacYxHQgecQZNVa6v+3JXqXq1twZyDLxClKFAs1e5avbT1gWozRMUK07npuaIKfKcCZwUu5mGlPKRnSAHUsljVEH+ezUCTm1Sp9EibIlDZmpvydyGms9jkPbGVMz1IveVPzP62QmugpyLtPMoGTzRVEmiEnI9G/S5wqZEWNLKFPc3krYkCrKjE2nbEPwFl9eJv55rV5z7y6qjXqRRgmO4QTOwINLaMANNMEHBgN4hld4c4Tz4rw7H/PWFaeYOYI/cD5/AIs6jXg=</latexit><latexit sha1_base64="Vz3oijCeKvUFwa+Eml0ScRB70i8=">AAAB6XicbVBNS8NAEJ34WetX1aOXxSJ4KokI2lvBi94qGltoQ9lsJ+3SzSbsboQS+hO8eFDx6j/y5r9x2+agrQ8GHu/NMDMvTAXXxnW/nZXVtfWNzdJWeXtnd2+/cnD4qJNMMfRZIhLVDqlGwSX6hhuB7VQhjUOBrXB0PfVbT6g0T+SDGacYxHQgecQZNVa6v+3JXqXq1twZyDLxClKFAs1e5avbT1gWozRMUK07npuaIKfKcCZwUu5mGlPKRnSAHUsljVEH+ezUCTm1Sp9EibIlDZmpvydyGms9jkPbGVMz1IveVPzP62QmugpyLtPMoGTzRVEmiEnI9G/S5wqZEWNLKFPc3krYkCrKjE2nbEPwFl9eJv55rV5z7y6qjXqRRgmO4QTOwINLaMANNMEHBgN4hld4c4Tz4rw7H/PWFaeYOYI/cD5/AIs6jXg=</latexit><latexit sha1_base64="Vz3oijCeKvUFwa+Eml0ScRB70i8=">AAAB6XicbVBNS8NAEJ34WetX1aOXxSJ4KokI2lvBi94qGltoQ9lsJ+3SzSbsboQS+hO8eFDx6j/y5r9x2+agrQ8GHu/NMDMvTAXXxnW/nZXVtfWNzdJWeXtnd2+/cnD4qJNMMfRZIhLVDqlGwSX6hhuB7VQhjUOBrXB0PfVbT6g0T+SDGacYxHQgecQZNVa6v+3JXqXq1twZyDLxClKFAs1e5avbT1gWozRMUK07npuaIKfKcCZwUu5mGlPKRnSAHUsljVEH+ezUCTm1Sp9EibIlDZmpvydyGms9jkPbGVMz1IveVPzP62QmugpyLtPMoGTzRVEmiEnI9G/S5wqZEWNLKFPc3krYkCrKjE2nbEPwFl9eJv55rV5z7y6qjXqRRgmO4QTOwINLaMANNMEHBgN4hld4c4Tz4rw7H/PWFaeYOYI/cD5/AIs6jXg=</latexit>

· · ·<latexit sha1_base64="71MNtDplGt4aEzD9uEEQp2QnnHY=">AAAB7XicbVBNS8NAEJ34WetX1aOXYBE8lVQE7a3gxWMFYwttKJvNpl262Q27E6GE/ggvHlS8+n+8+W/ctDlo64OBx3szzMwLU8ENet63s7a+sbm1Xdmp7u7tHxzWjo4fjco0ZT5VQuleSAwTXDIfOQrWSzUjSShYN5zcFn73iWnDlXzAacqChIwkjzklaKXugEYKTXVYq3sNbw53lTRLUocSnWHtaxApmiVMIhXEmH7TSzHIiUZOBZtVB5lhKaETMmJ9SyVJmAny+bkz99wqkRsrbUuiO1d/T+QkMWaahLYzITg2y14h/uf1M4xvgpzLNEMm6WJRnAkXlVv87kZcM4piagmhmttbXTommlC0CRUhNJdfXiX+ZaPV8O6v6u1WmUYFTuEMLqAJ19CGO+iADxQm8Ayv8Oakzovz7nwsWteccuYE/sD5/AFONI8N</latexit><latexit sha1_base64="71MNtDplGt4aEzD9uEEQp2QnnHY=">AAAB7XicbVBNS8NAEJ34WetX1aOXYBE8lVQE7a3gxWMFYwttKJvNpl262Q27E6GE/ggvHlS8+n+8+W/ctDlo64OBx3szzMwLU8ENet63s7a+sbm1Xdmp7u7tHxzWjo4fjco0ZT5VQuleSAwTXDIfOQrWSzUjSShYN5zcFn73iWnDlXzAacqChIwkjzklaKXugEYKTXVYq3sNbw53lTRLUocSnWHtaxApmiVMIhXEmH7TSzHIiUZOBZtVB5lhKaETMmJ9SyVJmAny+bkz99wqkRsrbUuiO1d/T+QkMWaahLYzITg2y14h/uf1M4xvgpzLNEMm6WJRnAkXlVv87kZcM4piagmhmttbXTommlC0CRUhNJdfXiX+ZaPV8O6v6u1WmUYFTuEMLqAJ19CGO+iADxQm8Ayv8Oakzovz7nwsWteccuYE/sD5/AFONI8N</latexit><latexit sha1_base64="71MNtDplGt4aEzD9uEEQp2QnnHY=">AAAB7XicbVBNS8NAEJ34WetX1aOXYBE8lVQE7a3gxWMFYwttKJvNpl262Q27E6GE/ggvHlS8+n+8+W/ctDlo64OBx3szzMwLU8ENet63s7a+sbm1Xdmp7u7tHxzWjo4fjco0ZT5VQuleSAwTXDIfOQrWSzUjSShYN5zcFn73iWnDlXzAacqChIwkjzklaKXugEYKTXVYq3sNbw53lTRLUocSnWHtaxApmiVMIhXEmH7TSzHIiUZOBZtVB5lhKaETMmJ9SyVJmAny+bkz99wqkRsrbUuiO1d/T+QkMWaahLYzITg2y14h/uf1M4xvgpzLNEMm6WJRnAkXlVv87kZcM4piagmhmttbXTommlC0CRUhNJdfXiX+ZaPV8O6v6u1WmUYFTuEMLqAJ19CGO+iADxQm8Ayv8Oakzovz7nwsWteccuYE/sD5/AFONI8N</latexit>

· · ·<latexit sha1_base64="71MNtDplGt4aEzD9uEEQp2QnnHY=">AAAB7XicbVBNS8NAEJ34WetX1aOXYBE8lVQE7a3gxWMFYwttKJvNpl262Q27E6GE/ggvHlS8+n+8+W/ctDlo64OBx3szzMwLU8ENet63s7a+sbm1Xdmp7u7tHxzWjo4fjco0ZT5VQuleSAwTXDIfOQrWSzUjSShYN5zcFn73iWnDlXzAacqChIwkjzklaKXugEYKTXVYq3sNbw53lTRLUocSnWHtaxApmiVMIhXEmH7TSzHIiUZOBZtVB5lhKaETMmJ9SyVJmAny+bkz99wqkRsrbUuiO1d/T+QkMWaahLYzITg2y14h/uf1M4xvgpzLNEMm6WJRnAkXlVv87kZcM4piagmhmttbXTommlC0CRUhNJdfXiX+ZaPV8O6v6u1WmUYFTuEMLqAJ19CGO+iADxQm8Ayv8Oakzovz7nwsWteccuYE/sD5/AFONI8N</latexit><latexit sha1_base64="71MNtDplGt4aEzD9uEEQp2QnnHY=">AAAB7XicbVBNS8NAEJ34WetX1aOXYBE8lVQE7a3gxWMFYwttKJvNpl262Q27E6GE/ggvHlS8+n+8+W/ctDlo64OBx3szzMwLU8ENet63s7a+sbm1Xdmp7u7tHxzWjo4fjco0ZT5VQuleSAwTXDIfOQrWSzUjSShYN5zcFn73iWnDlXzAacqChIwkjzklaKXugEYKTXVYq3sNbw53lTRLUocSnWHtaxApmiVMIhXEmH7TSzHIiUZOBZtVB5lhKaETMmJ9SyVJmAny+bkz99wqkRsrbUuiO1d/T+QkMWaahLYzITg2y14h/uf1M4xvgpzLNEMm6WJRnAkXlVv87kZcM4piagmhmttbXTommlC0CRUhNJdfXiX+ZaPV8O6v6u1WmUYFTuEMLqAJ19CGO+iADxQm8Ayv8Oakzovz7nwsWteccuYE/sD5/AFONI8N</latexit><latexit sha1_base64="71MNtDplGt4aEzD9uEEQp2QnnHY=">AAAB7XicbVBNS8NAEJ34WetX1aOXYBE8lVQE7a3gxWMFYwttKJvNpl262Q27E6GE/ggvHlS8+n+8+W/ctDlo64OBx3szzMwLU8ENet63s7a+sbm1Xdmp7u7tHxzWjo4fjco0ZT5VQuleSAwTXDIfOQrWSzUjSShYN5zcFn73iWnDlXzAacqChIwkjzklaKXugEYKTXVYq3sNbw53lTRLUocSnWHtaxApmiVMIhXEmH7TSzHIiUZOBZtVB5lhKaETMmJ9SyVJmAny+bkz99wqkRsrbUuiO1d/T+QkMWaahLYzITg2y14h/uf1M4xvgpzLNEMm6WJRnAkXlVv87kZcM4piagmhmttbXTommlC0CRUhNJdfXiX+ZaPV8O6v6u1WmUYFTuEMLqAJ19CGO+iADxQm8Ayv8Oakzovz7nwsWteccuYE/sD5/AFONI8N</latexit>

· · ·<latexit sha1_base64="71MNtDplGt4aEzD9uEEQp2QnnHY=">AAAB7XicbVBNS8NAEJ34WetX1aOXYBE8lVQE7a3gxWMFYwttKJvNpl262Q27E6GE/ggvHlS8+n+8+W/ctDlo64OBx3szzMwLU8ENet63s7a+sbm1Xdmp7u7tHxzWjo4fjco0ZT5VQuleSAwTXDIfOQrWSzUjSShYN5zcFn73iWnDlXzAacqChIwkjzklaKXugEYKTXVYq3sNbw53lTRLUocSnWHtaxApmiVMIhXEmH7TSzHIiUZOBZtVB5lhKaETMmJ9SyVJmAny+bkz99wqkRsrbUuiO1d/T+QkMWaahLYzITg2y14h/uf1M4xvgpzLNEMm6WJRnAkXlVv87kZcM4piagmhmttbXTommlC0CRUhNJdfXiX+ZaPV8O6v6u1WmUYFTuEMLqAJ19CGO+iADxQm8Ayv8Oakzovz7nwsWteccuYE/sD5/AFONI8N</latexit><latexit sha1_base64="71MNtDplGt4aEzD9uEEQp2QnnHY=">AAAB7XicbVBNS8NAEJ34WetX1aOXYBE8lVQE7a3gxWMFYwttKJvNpl262Q27E6GE/ggvHlS8+n+8+W/ctDlo64OBx3szzMwLU8ENet63s7a+sbm1Xdmp7u7tHxzWjo4fjco0ZT5VQuleSAwTXDIfOQrWSzUjSShYN5zcFn73iWnDlXzAacqChIwkjzklaKXugEYKTXVYq3sNbw53lTRLUocSnWHtaxApmiVMIhXEmH7TSzHIiUZOBZtVB5lhKaETMmJ9SyVJmAny+bkz99wqkRsrbUuiO1d/T+QkMWaahLYzITg2y14h/uf1M4xvgpzLNEMm6WJRnAkXlVv87kZcM4piagmhmttbXTommlC0CRUhNJdfXiX+ZaPV8O6v6u1WmUYFTuEMLqAJ19CGO+iADxQm8Ayv8Oakzovz7nwsWteccuYE/sD5/AFONI8N</latexit><latexit sha1_base64="71MNtDplGt4aEzD9uEEQp2QnnHY=">AAAB7XicbVBNS8NAEJ34WetX1aOXYBE8lVQE7a3gxWMFYwttKJvNpl262Q27E6GE/ggvHlS8+n+8+W/ctDlo64OBx3szzMwLU8ENet63s7a+sbm1Xdmp7u7tHxzWjo4fjco0ZT5VQuleSAwTXDIfOQrWSzUjSShYN5zcFn73iWnDlXzAacqChIwkjzklaKXugEYKTXVYq3sNbw53lTRLUocSnWHtaxApmiVMIhXEmH7TSzHIiUZOBZtVB5lhKaETMmJ9SyVJmAny+bkz99wqkRsrbUuiO1d/T+QkMWaahLYzITg2y14h/uf1M4xvgpzLNEMm6WJRnAkXlVv87kZcM4piagmhmttbXTommlC0CRUhNJdfXiX+ZaPV8O6v6u1WmUYFTuEMLqAJ19CGO+iADxQm8Ayv8Oakzovz7nwsWteccuYE/sD5/AFONI8N</latexit>

· · ·<latexit sha1_base64="71MNtDplGt4aEzD9uEEQp2QnnHY=">AAAB7XicbVBNS8NAEJ34WetX1aOXYBE8lVQE7a3gxWMFYwttKJvNpl262Q27E6GE/ggvHlS8+n+8+W/ctDlo64OBx3szzMwLU8ENet63s7a+sbm1Xdmp7u7tHxzWjo4fjco0ZT5VQuleSAwTXDIfOQrWSzUjSShYN5zcFn73iWnDlXzAacqChIwkjzklaKXugEYKTXVYq3sNbw53lTRLUocSnWHtaxApmiVMIhXEmH7TSzHIiUZOBZtVB5lhKaETMmJ9SyVJmAny+bkz99wqkRsrbUuiO1d/T+QkMWaahLYzITg2y14h/uf1M4xvgpzLNEMm6WJRnAkXlVv87kZcM4piagmhmttbXTommlC0CRUhNJdfXiX+ZaPV8O6v6u1WmUYFTuEMLqAJ19CGO+iADxQm8Ayv8Oakzovz7nwsWteccuYE/sD5/AFONI8N</latexit><latexit sha1_base64="71MNtDplGt4aEzD9uEEQp2QnnHY=">AAAB7XicbVBNS8NAEJ34WetX1aOXYBE8lVQE7a3gxWMFYwttKJvNpl262Q27E6GE/ggvHlS8+n+8+W/ctDlo64OBx3szzMwLU8ENet63s7a+sbm1Xdmp7u7tHxzWjo4fjco0ZT5VQuleSAwTXDIfOQrWSzUjSShYN5zcFn73iWnDlXzAacqChIwkjzklaKXugEYKTXVYq3sNbw53lTRLUocSnWHtaxApmiVMIhXEmH7TSzHIiUZOBZtVB5lhKaETMmJ9SyVJmAny+bkz99wqkRsrbUuiO1d/T+QkMWaahLYzITg2y14h/uf1M4xvgpzLNEMm6WJRnAkXlVv87kZcM4piagmhmttbXTommlC0CRUhNJdfXiX+ZaPV8O6v6u1WmUYFTuEMLqAJ19CGO+iADxQm8Ayv8Oakzovz7nwsWteccuYE/sD5/AFONI8N</latexit><latexit sha1_base64="71MNtDplGt4aEzD9uEEQp2QnnHY=">AAAB7XicbVBNS8NAEJ34WetX1aOXYBE8lVQE7a3gxWMFYwttKJvNpl262Q27E6GE/ggvHlS8+n+8+W/ctDlo64OBx3szzMwLU8ENet63s7a+sbm1Xdmp7u7tHxzWjo4fjco0ZT5VQuleSAwTXDIfOQrWSzUjSShYN5zcFn73iWnDlXzAacqChIwkjzklaKXugEYKTXVYq3sNbw53lTRLUocSnWHtaxApmiVMIhXEmH7TSzHIiUZOBZtVB5lhKaETMmJ9SyVJmAny+bkz99wqkRsrbUuiO1d/T+QkMWaahLYzITg2y14h/uf1M4xvgpzLNEMm6WJRnAkXlVv87kZcM4piagmhmttbXTommlC0CRUhNJdfXiX+ZaPV8O6v6u1WmUYFTuEMLqAJ19CGO+iADxQm8Ayv8Oakzovz7nwsWteccuYE/sD5/AFONI8N</latexit>

· · ·<latexit sha1_base64="F9nsRaTe7ujR44aLdk07GVFawH8=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69BIvgqaQiqBcpePFYwdhCG8pms2mXbnbD7kQooT/CiwcVr/4fb/4bN20OWn0w8Hhvhpl5YSq4Qc/7ciorq2vrG9XN2tb2zu5eff/gwahMU+ZTJZTuhcQwwSXzkaNgvVQzkoSCdcPJTeF3H5k2XMl7nKYsSMhI8phTglbqDmik0NSG9YbX9OZw/5JWSRpQojOsfw4iRbOESaSCGNNveSkGOdHIqWCz2iAzLCV0Qkasb6kkCTNBPj935p5YJXJjpW1JdOfqz4mcJMZMk9B2JgTHZtkrxP+8fobxZZBzmWbIJF0sijPhonKL392Ia0ZRTC0hVHN7q0vHRBOKNqEihNbyy3+Jf9a8anp35432dZlGFY7gGE6hBRfQhlvogA8UJvAEL/DqpM6z8+a8L1orTjlzCL/gfHwDT7WPEg==</latexit><latexit sha1_base64="F9nsRaTe7ujR44aLdk07GVFawH8=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69BIvgqaQiqBcpePFYwdhCG8pms2mXbnbD7kQooT/CiwcVr/4fb/4bN20OWn0w8Hhvhpl5YSq4Qc/7ciorq2vrG9XN2tb2zu5eff/gwahMU+ZTJZTuhcQwwSXzkaNgvVQzkoSCdcPJTeF3H5k2XMl7nKYsSMhI8phTglbqDmik0NSG9YbX9OZw/5JWSRpQojOsfw4iRbOESaSCGNNveSkGOdHIqWCz2iAzLCV0Qkasb6kkCTNBPj935p5YJXJjpW1JdOfqz4mcJMZMk9B2JgTHZtkrxP+8fobxZZBzmWbIJF0sijPhonKL392Ia0ZRTC0hVHN7q0vHRBOKNqEihNbyy3+Jf9a8anp35432dZlGFY7gGE6hBRfQhlvogA8UJvAEL/DqpM6z8+a8L1orTjlzCL/gfHwDT7WPEg==</latexit><latexit sha1_base64="F9nsRaTe7ujR44aLdk07GVFawH8=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69BIvgqaQiqBcpePFYwdhCG8pms2mXbnbD7kQooT/CiwcVr/4fb/4bN20OWn0w8Hhvhpl5YSq4Qc/7ciorq2vrG9XN2tb2zu5eff/gwahMU+ZTJZTuhcQwwSXzkaNgvVQzkoSCdcPJTeF3H5k2XMl7nKYsSMhI8phTglbqDmik0NSG9YbX9OZw/5JWSRpQojOsfw4iRbOESaSCGNNveSkGOdHIqWCz2iAzLCV0Qkasb6kkCTNBPj935p5YJXJjpW1JdOfqz4mcJMZMk9B2JgTHZtkrxP+8fobxZZBzmWbIJF0sijPhonKL392Ia0ZRTC0hVHN7q0vHRBOKNqEihNbyy3+Jf9a8anp35432dZlGFY7gGE6hBRfQhlvogA8UJvAEL/DqpM6z8+a8L1orTjlzCL/gfHwDT7WPEg==</latexit>

G1<latexit sha1_base64="Q9q0uwUrDmQOUuRrqNt0XHBqGp4=">AAAB9HicbVBNSwMxFHxbv2r9qnr0EiyCp7IrgnqRggc9VnBtoV1LNs22odlkSbJKWfo/vHhQ8eqP8ea/MdvuQVsHAsPMe7zJhAln2rjut1NaWl5ZXSuvVzY2t7Z3qrt791qmilCfSC5VO8Saciaob5jhtJ0oiuOQ01Y4usr91iNVmklxZ8YJDWI8ECxiBBsrPXRjbIYE8+x60vMqvWrNrbtToEXiFaQGBZq96le3L0kaU2EIx1p3PDcxQYaVYYTTSaWbappgMsID2rFU4JjqIJumnqAjq/RRJJV9wqCp+nsjw7HW4zi0k3lKPe/l4n9eJzXReZAxkaSGCjI7FKUcGYnyClCfKUoMH1uCiWI2KyJDrDAxtqi8BG/+y4vEP6lf1N3b01rjsmijDAdwCMfgwRk04Aaa4AMBBc/wCm/Ok/PivDsfs9GSU+zswx84nz9IW5Hk</latexit><latexit sha1_base64="Q9q0uwUrDmQOUuRrqNt0XHBqGp4=">AAAB9HicbVBNSwMxFHxbv2r9qnr0EiyCp7IrgnqRggc9VnBtoV1LNs22odlkSbJKWfo/vHhQ8eqP8ea/MdvuQVsHAsPMe7zJhAln2rjut1NaWl5ZXSuvVzY2t7Z3qrt791qmilCfSC5VO8Saciaob5jhtJ0oiuOQ01Y4usr91iNVmklxZ8YJDWI8ECxiBBsrPXRjbIYE8+x60vMqvWrNrbtToEXiFaQGBZq96le3L0kaU2EIx1p3PDcxQYaVYYTTSaWbappgMsID2rFU4JjqIJumnqAjq/RRJJV9wqCp+nsjw7HW4zi0k3lKPe/l4n9eJzXReZAxkaSGCjI7FKUcGYnyClCfKUoMH1uCiWI2KyJDrDAxtqi8BG/+y4vEP6lf1N3b01rjsmijDAdwCMfgwRk04Aaa4AMBBc/wCm/Ok/PivDsfs9GSU+zswx84nz9IW5Hk</latexit><latexit sha1_base64="Q9q0uwUrDmQOUuRrqNt0XHBqGp4=">AAAB9HicbVBNSwMxFHxbv2r9qnr0EiyCp7IrgnqRggc9VnBtoV1LNs22odlkSbJKWfo/vHhQ8eqP8ea/MdvuQVsHAsPMe7zJhAln2rjut1NaWl5ZXSuvVzY2t7Z3qrt791qmilCfSC5VO8Saciaob5jhtJ0oiuOQ01Y4usr91iNVmklxZ8YJDWI8ECxiBBsrPXRjbIYE8+x60vMqvWrNrbtToEXiFaQGBZq96le3L0kaU2EIx1p3PDcxQYaVYYTTSaWbappgMsID2rFU4JjqIJumnqAjq/RRJJV9wqCp+nsjw7HW4zi0k3lKPe/l4n9eJzXReZAxkaSGCjI7FKUcGYnyClCfKUoMH1uCiWI2KyJDrDAxtqi8BG/+y4vEP6lf1N3b01rjsmijDAdwCMfgwRk04Aaa4AMBBc/wCm/Ok/PivDsfs9GSU+zswx84nz9IW5Hk</latexit>

G2<latexit sha1_base64="S/xPvMskHXI1CRHpwRrWy4JlwpQ=">AAAB9HicbVBNSwMxFHzrZ61fVY9egkXwVHaLoF6k4EGPFVxbaNeSTbNtaDZZkqxSlv4PLx5UvPpjvPlvzLZ70NaBwDDzHm8yYcKZNq777Swtr6yurZc2yptb2zu7lb39ey1TRahPJJeqHWJNORPUN8xw2k4UxXHIaSscXeV+65EqzaS4M+OEBjEeCBYxgo2VHroxNkOCeXY96dXLvUrVrblToEXiFaQKBZq9yle3L0kaU2EIx1p3PDcxQYaVYYTTSbmbappgMsID2rFU4JjqIJumnqBjq/RRJJV9wqCp+nsjw7HW4zi0k3lKPe/l4n9eJzXReZAxkaSGCjI7FKUcGYnyClCfKUoMH1uCiWI2KyJDrDAxtqi8BG/+y4vEr9cuau7tabVxWbRRgkM4ghPw4AwacANN8IGAgmd4hTfnyXlx3p2P2eiSU+wcwB84nz9J35Hl</latexit><latexit sha1_base64="S/xPvMskHXI1CRHpwRrWy4JlwpQ=">AAAB9HicbVBNSwMxFHzrZ61fVY9egkXwVHaLoF6k4EGPFVxbaNeSTbNtaDZZkqxSlv4PLx5UvPpjvPlvzLZ70NaBwDDzHm8yYcKZNq777Swtr6yurZc2yptb2zu7lb39ey1TRahPJJeqHWJNORPUN8xw2k4UxXHIaSscXeV+65EqzaS4M+OEBjEeCBYxgo2VHroxNkOCeXY96dXLvUrVrblToEXiFaQKBZq9yle3L0kaU2EIx1p3PDcxQYaVYYTTSbmbappgMsID2rFU4JjqIJumnqBjq/RRJJV9wqCp+nsjw7HW4zi0k3lKPe/l4n9eJzXReZAxkaSGCjI7FKUcGYnyClCfKUoMH1uCiWI2KyJDrDAxtqi8BG/+y4vEr9cuau7tabVxWbRRgkM4ghPw4AwacANN8IGAgmd4hTfnyXlx3p2P2eiSU+wcwB84nz9J35Hl</latexit><latexit sha1_base64="S/xPvMskHXI1CRHpwRrWy4JlwpQ=">AAAB9HicbVBNSwMxFHzrZ61fVY9egkXwVHaLoF6k4EGPFVxbaNeSTbNtaDZZkqxSlv4PLx5UvPpjvPlvzLZ70NaBwDDzHm8yYcKZNq777Swtr6yurZc2yptb2zu7lb39ey1TRahPJJeqHWJNORPUN8xw2k4UxXHIaSscXeV+65EqzaS4M+OEBjEeCBYxgo2VHroxNkOCeXY96dXLvUrVrblToEXiFaQKBZq9yle3L0kaU2EIx1p3PDcxQYaVYYTTSbmbappgMsID2rFU4JjqIJumnqBjq/RRJJV9wqCp+nsjw7HW4zi0k3lKPe/l4n9eJzXReZAxkaSGCjI7FKUcGYnyClCfKUoMH1uCiWI2KyJDrDAxtqi8BG/+y4vEr9cuau7tabVxWbRRgkM4ghPw4AwacANN8IGAgmd4hTfnyXlx3p2P2eiSU+wcwB84nz9J35Hl</latexit>

GN<latexit sha1_base64="bfZduSSbGibV8GN2/Z8tXR4zjYg=">AAAB9HicbVBNSwMxFHypX7V+VT16CRbBU9kVQb1IwYOepIJrC+1asmm2Dc1mlySrlKX/w4sHFa/+GG/+G7PtHrQ6EBhm3uNNJkgE18ZxvlBpYXFpeaW8Wllb39jcqm7v3Ok4VZR5NBaxagdEM8El8ww3grUTxUgUCNYKRhe533pgSvNY3ppxwvyIDCQPOSXGSvfdiJghJSK7nPSuK71qzak7U+C/xC1IDQo0e9XPbj+macSkoYJo3XGdxPgZUYZTwSaVbqpZQuiIDFjHUkkipv1smnqCD6zSx2Gs7JMGT9WfGxmJtB5HgZ3MU+p5Lxf/8zqpCU/9jMskNUzS2aEwFdjEOK8A97li1IixJYQqbrNiOiSKUGOLyktw57/8l3hH9bO6c3Nca5wXbZRhD/bhEFw4gQZcQRM8oKDgCV7gFT2iZ/SG3mejJVTs7MIvoI9vdE+SAQ==</latexit><latexit sha1_base64="bfZduSSbGibV8GN2/Z8tXR4zjYg=">AAAB9HicbVBNSwMxFHypX7V+VT16CRbBU9kVQb1IwYOepIJrC+1asmm2Dc1mlySrlKX/w4sHFa/+GG/+G7PtHrQ6EBhm3uNNJkgE18ZxvlBpYXFpeaW8Wllb39jcqm7v3Ok4VZR5NBaxagdEM8El8ww3grUTxUgUCNYKRhe533pgSvNY3ppxwvyIDCQPOSXGSvfdiJghJSK7nPSuK71qzak7U+C/xC1IDQo0e9XPbj+macSkoYJo3XGdxPgZUYZTwSaVbqpZQuiIDFjHUkkipv1smnqCD6zSx2Gs7JMGT9WfGxmJtB5HgZ3MU+p5Lxf/8zqpCU/9jMskNUzS2aEwFdjEOK8A97li1IixJYQqbrNiOiSKUGOLyktw57/8l3hH9bO6c3Nca5wXbZRhD/bhEFw4gQZcQRM8oKDgCV7gFT2iZ/SG3mejJVTs7MIvoI9vdE+SAQ==</latexit><latexit sha1_base64="bfZduSSbGibV8GN2/Z8tXR4zjYg=">AAAB9HicbVBNSwMxFHypX7V+VT16CRbBU9kVQb1IwYOepIJrC+1asmm2Dc1mlySrlKX/w4sHFa/+GG/+G7PtHrQ6EBhm3uNNJkgE18ZxvlBpYXFpeaW8Wllb39jcqm7v3Ok4VZR5NBaxagdEM8El8ww3grUTxUgUCNYKRhe533pgSvNY3ppxwvyIDCQPOSXGSvfdiJghJSK7nPSuK71qzak7U+C/xC1IDQo0e9XPbj+macSkoYJo3XGdxPgZUYZTwSaVbqpZQuiIDFjHUkkipv1smnqCD6zSx2Gs7JMGT9WfGxmJtB5HgZ3MU+p5Lxf/8zqpCU/9jMskNUzS2aEwFdjEOK8A97li1IixJYQqbrNiOiSKUGOLyktw57/8l3hH9bO6c3Nca5wXbZRhD/bhEFw4gQZcQRM8oKDgCV7gFT2iZ/SG3mejJVTs7MIvoI9vdE+SAQ==</latexit>

Gn<latexit sha1_base64="cs+463K5n0SSaMT1X9i0K+EuRu4=">AAAB9HicbVBNSwMxFHxbv2r9qnr0EiyCp7IrgnqRggc9VnBtoV1LNs22odlkSbJKWfo/vHhQ8eqP8ea/MdvuQVsHAsPMe7zJhAln2rjut1NaWl5ZXSuvVzY2t7Z3qrt791qmilCfSC5VO8Saciaob5jhtJ0oiuOQ01Y4usr91iNVmklxZ8YJDWI8ECxiBBsrPXRjbIYE8+x60hOVXrXm1t0p0CLxClKDAs1e9avblySNqTCEY607npuYIMPKMMLppNJNNU0wGeEB7VgqcEx1kE1TT9CRVfookso+YdBU/b2R4VjrcRzayTylnvdy8T+vk5roPMiYSFJDBZkdilKOjER5BajPFCWGjy3BRDGbFZEhVpgYW1Regjf/5UXin9Qv6u7taa1xWbRRhgM4hGPw4AwacANN8IGAgmd4hTfnyXlx3p2P2WjJKXb24Q+czx+kz5Ih</latexit><latexit sha1_base64="cs+463K5n0SSaMT1X9i0K+EuRu4=">AAAB9HicbVBNSwMxFHxbv2r9qnr0EiyCp7IrgnqRggc9VnBtoV1LNs22odlkSbJKWfo/vHhQ8eqP8ea/MdvuQVsHAsPMe7zJhAln2rjut1NaWl5ZXSuvVzY2t7Z3qrt791qmilCfSC5VO8Saciaob5jhtJ0oiuOQ01Y4usr91iNVmklxZ8YJDWI8ECxiBBsrPXRjbIYE8+x60hOVXrXm1t0p0CLxClKDAs1e9avblySNqTCEY607npuYIMPKMMLppNJNNU0wGeEB7VgqcEx1kE1TT9CRVfookso+YdBU/b2R4VjrcRzayTylnvdy8T+vk5roPMiYSFJDBZkdilKOjER5BajPFCWGjy3BRDGbFZEhVpgYW1Regjf/5UXin9Qv6u7taa1xWbRRhgM4hGPw4AwacANN8IGAgmd4hTfnyXlx3p2P2WjJKXb24Q+czx+kz5Ih</latexit><latexit sha1_base64="cs+463K5n0SSaMT1X9i0K+EuRu4=">AAAB9HicbVBNSwMxFHxbv2r9qnr0EiyCp7IrgnqRggc9VnBtoV1LNs22odlkSbJKWfo/vHhQ8eqP8ea/MdvuQVsHAsPMe7zJhAln2rjut1NaWl5ZXSuvVzY2t7Z3qrt791qmilCfSC5VO8Saciaob5jhtJ0oiuOQ01Y4usr91iNVmklxZ8YJDWI8ECxiBBsrPXRjbIYE8+x60hOVXrXm1t0p0CLxClKDAs1e9avblySNqTCEY607npuYIMPKMMLppNJNNU0wGeEB7VgqcEx1kE1TT9CRVfookso+YdBU/b2R4VjrcRzayTylnvdy8T+vk5roPMiYSFJDBZkdilKOjER5BajPFCWGjy3BRDGbFZEhVpgYW1Regjf/5UXin9Qv6u7taa1xWbRRhgM4hGPw4AwacANN8IGAgmd4hTfnyXlx3p2P2WjJKXb24Q+czx+kz5Ih</latexit>

· · ·<latexit sha1_base64="F9nsRaTe7ujR44aLdk07GVFawH8=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69BIvgqaQiqBcpePFYwdhCG8pms2mXbnbD7kQooT/CiwcVr/4fb/4bN20OWn0w8Hhvhpl5YSq4Qc/7ciorq2vrG9XN2tb2zu5eff/gwahMU+ZTJZTuhcQwwSXzkaNgvVQzkoSCdcPJTeF3H5k2XMl7nKYsSMhI8phTglbqDmik0NSG9YbX9OZw/5JWSRpQojOsfw4iRbOESaSCGNNveSkGOdHIqWCz2iAzLCV0Qkasb6kkCTNBPj935p5YJXJjpW1JdOfqz4mcJMZMk9B2JgTHZtkrxP+8fobxZZBzmWbIJF0sijPhonKL392Ia0ZRTC0hVHN7q0vHRBOKNqEihNbyy3+Jf9a8anp35432dZlGFY7gGE6hBRfQhlvogA8UJvAEL/DqpM6z8+a8L1orTjlzCL/gfHwDT7WPEg==</latexit><latexit sha1_base64="F9nsRaTe7ujR44aLdk07GVFawH8=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69BIvgqaQiqBcpePFYwdhCG8pms2mXbnbD7kQooT/CiwcVr/4fb/4bN20OWn0w8Hhvhpl5YSq4Qc/7ciorq2vrG9XN2tb2zu5eff/gwahMU+ZTJZTuhcQwwSXzkaNgvVQzkoSCdcPJTeF3H5k2XMl7nKYsSMhI8phTglbqDmik0NSG9YbX9OZw/5JWSRpQojOsfw4iRbOESaSCGNNveSkGOdHIqWCz2iAzLCV0Qkasb6kkCTNBPj935p5YJXJjpW1JdOfqz4mcJMZMk9B2JgTHZtkrxP+8fobxZZBzmWbIJF0sijPhonKL392Ia0ZRTC0hVHN7q0vHRBOKNqEihNbyy3+Jf9a8anp35432dZlGFY7gGE6hBRfQhlvogA8UJvAEL/DqpM6z8+a8L1orTjlzCL/gfHwDT7WPEg==</latexit><latexit sha1_base64="F9nsRaTe7ujR44aLdk07GVFawH8=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69BIvgqaQiqBcpePFYwdhCG8pms2mXbnbD7kQooT/CiwcVr/4fb/4bN20OWn0w8Hhvhpl5YSq4Qc/7ciorq2vrG9XN2tb2zu5eff/gwahMU+ZTJZTuhcQwwSXzkaNgvVQzkoSCdcPJTeF3H5k2XMl7nKYsSMhI8phTglbqDmik0NSG9YbX9OZw/5JWSRpQojOsfw4iRbOESaSCGNNveSkGOdHIqWCz2iAzLCV0Qkasb6kkCTNBPj935p5YJXJjpW1JdOfqz4mcJMZMk9B2JgTHZtkrxP+8fobxZZBzmWbIJF0sijPhonKL392Ia0ZRTC0hVHN7q0vHRBOKNqEihNbyy3+Jf9a8anp35432dZlGFY7gGE6hBRfQhlvogA8UJvAEL/DqpM6z8+a8L1orTjlzCL/gfHwDT7WPEg==</latexit>

X<latexit sha1_base64="Z7Lba4lY9L7gpHl4OF7R49qtXOk=">AAAB8nicbVBNS8NAFHzxs9avqkcvi0XwVFIR1IsUvHisYGyhCWWz3bRLN5uw+yKU0L/hxYOKV3+NN/+NmzYHbR1YGGbe481OmEph0HW/nZXVtfWNzcpWdXtnd2+/dnD4aJJMM+6xRCa6G1LDpVDcQ4GSd1PNaRxK3gnHt4XfeeLaiEQ94CTlQUyHSkSCUbSS78cUR4zKvDut9mt1t+HOQJZJsyR1KNHu1778QcKymCtkkhrTa7opBjnVKJjk06qfGZ5SNqZD3rNU0ZibIJ9lnpJTqwxIlGj7FJKZ+nsjp7Exkzi0k0VGs+gV4n9eL8PoKsiFSjPkis0PRZkkmJCiADIQmjOUE0so08JmJWxENWVoaypKaC5+eZl4543rhnt/UW/dlG1U4BhO4AyacAktuIM2eMAghWd4hTcnc16cd+djPrrilDtH8AfO5w80pJFR</latexit><latexit sha1_base64="Z7Lba4lY9L7gpHl4OF7R49qtXOk=">AAAB8nicbVBNS8NAFHzxs9avqkcvi0XwVFIR1IsUvHisYGyhCWWz3bRLN5uw+yKU0L/hxYOKV3+NN/+NmzYHbR1YGGbe481OmEph0HW/nZXVtfWNzcpWdXtnd2+/dnD4aJJMM+6xRCa6G1LDpVDcQ4GSd1PNaRxK3gnHt4XfeeLaiEQ94CTlQUyHSkSCUbSS78cUR4zKvDut9mt1t+HOQJZJsyR1KNHu1778QcKymCtkkhrTa7opBjnVKJjk06qfGZ5SNqZD3rNU0ZibIJ9lnpJTqwxIlGj7FJKZ+nsjp7Exkzi0k0VGs+gV4n9eL8PoKsiFSjPkis0PRZkkmJCiADIQmjOUE0so08JmJWxENWVoaypKaC5+eZl4543rhnt/UW/dlG1U4BhO4AyacAktuIM2eMAghWd4hTcnc16cd+djPrrilDtH8AfO5w80pJFR</latexit><latexit sha1_base64="Z7Lba4lY9L7gpHl4OF7R49qtXOk=">AAAB8nicbVBNS8NAFHzxs9avqkcvi0XwVFIR1IsUvHisYGyhCWWz3bRLN5uw+yKU0L/hxYOKV3+NN/+NmzYHbR1YGGbe481OmEph0HW/nZXVtfWNzcpWdXtnd2+/dnD4aJJMM+6xRCa6G1LDpVDcQ4GSd1PNaRxK3gnHt4XfeeLaiEQ94CTlQUyHSkSCUbSS78cUR4zKvDut9mt1t+HOQJZJsyR1KNHu1778QcKymCtkkhrTa7opBjnVKJjk06qfGZ5SNqZD3rNU0ZibIJ9lnpJTqwxIlGj7FJKZ+nsjp7Exkzi0k0VGs+gV4n9eL8PoKsiFSjPkis0PRZkkmJCiADIQmjOUE0so08JmJWxENWVoaypKaC5+eZl4543rhnt/UW/dlG1U4BhO4AyacAktuIM2eMAghWd4hTcnc16cd+djPrrilDtH8AfO5w80pJFR</latexit>

Figure 1. TR decomposition.

that our method is robust to tensor dimension. When the image data is tensorized to a properly chosen higher dimension, the performance of our method can be further enhanced.

II. PRELIMINARIES

A. Notations

In this paper, a scalar is denoted by a normal lowercase or capital letter, e.g., x ∈ R and X ∈ R. A vector is denoted by a boldface lowercase letter, e.g., x ∈ R^I. A matrix is denoted by a boldface capital letter, e.g., X ∈ R^(I×J). A tensor of dimension N ≥ 3 is denoted by an Euler script letter, e.g., X ∈ R^(I1×I2×···×IN). For n = 1, ..., N, {X(n)}_{n=1}^{N} := {X(1), X(2), ..., X(N)} is defined as a tensor sequence, and X(n) is the nth tensor of the sequence. Matrix sequences and vector sequences are denoted in the same way. An element of a tensor X ∈ R^(I1×I2×···×IN) at index (i1, i2, ..., iN) is denoted by x_{i1 i2···iN} or X(i1, i2, ..., iN). One type of mode-n matricization (unfolding) of X is denoted by X_(n) ∈ R^(In × I1···In−1 In+1···IN). Another type of mode-n matricization is denoted by X_<n> ∈ R^(In × In+1···IN I1···In−1). The inner product of two tensors X, Y of the same size is defined as ⟨X, Y⟩ = ∑_{i1} ∑_{i2} ··· ∑_{iN} x_{i1 i2···iN} y_{i1 i2···iN}. Furthermore, the Frobenius norm of X is defined by ‖X‖F = √⟨X, X⟩.

The Hadamard product, denoted by '∗', is the element-wise product of vectors, matrices, or tensors of the same size. For example, given tensors X, Y ∈ R^(I1×I2×···×IN), Z = X ∗ Y satisfies Z ∈ R^(I1×I2×···×IN) and z_{i1 i2···iN} = x_{i1 i2···iN} y_{i1 i2···iN}.
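Since the two mode-n matricizations are used throughout the rest of the paper, a minimal NumPy sketch may help fix the index orderings (the helper names and the row-major column ordering are our assumptions, not from the paper):

```python
import numpy as np

def unfold(X, n):
    """Mode-n matricization X_(n): rows indexed by I_n, with the
    remaining modes kept in their original order (the exact column
    ordering is an assumption; conventions differ across papers)."""
    return np.moveaxis(X, n, 0).reshape(X.shape[n], -1)

def unfold_shift(X, n):
    """Second mode-n matricization X_<n>: columns ordered
    I_{n+1} ... I_N I_1 ... I_{n-1}, the unfolding used by TR."""
    N = X.ndim
    perm = [n] + list(range(n + 1, N)) + list(range(n))
    return np.transpose(X, perm).reshape(X.shape[n], -1)

X = np.arange(24.0).reshape(2, 3, 4)      # I1 x I2 x I3
print(unfold(X, 1).shape)                 # (3, 8)
print(unfold_shift(X, 1).shape)           # (3, 8)
# the Frobenius norm is invariant under matricization
print(np.isclose(np.linalg.norm(X), np.linalg.norm(unfold_shift(X, 1))))  # True
```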

B. Tensor ring decomposition

Tensor ring (TR) decomposition is a generalization of tensor-train (TT) decomposition. It represents a tensor by circular multilinear products over a sequence of lower-dimension latent factors [27], which are named core tensors. All core tensors of a TR decomposition are three-dimension tensors, denoted by G(n) ∈ R^(Rn×In×Rn+1), n = 1, ..., N, where R1, ..., RN denote the TR-rank, which controls the model complexity of the TR decomposition. We adopt the tensor network notation of the monograph [5]; the diagram of TR decomposition is shown in Figure 1. Similar to TT, the TR decomposition scales linearly with the tensor dimension, and thus it can overcome the "curse of dimensionality".

Proceedings, APSIPA Annual Summit and Conference 2018, 12-15 November 2018, Hawaii

The TR decomposition relaxes the rank constraint on the first and last core tensors of TT to R1 = RN+1, whereas the original TT constraint is more stringent, i.e., R1 = RN+1 = 1. TT requires this constraint because the original tensor must be recovered from the core tensors by a sequence of matrix multiplications; TR instead applies a trace operation, so all core tensors can equivalently be constrained to be three-dimension. TR can therefore be viewed as a linear combination of TT models, and it offers a more powerful and general representation ability than TT decomposition. The element-wise relation between the TR core tensors and the original tensor is given by:

X(i1, i2, ..., iN) = Trace(∏_{n=1}^{N} G(n)(in)),   (4)

where Trace(·) is the matrix trace operator and G(n)(in) ∈ R^(Rn×Rn+1) is the in-th mode-2 slice matrix of G(n), which can also be denoted by G(n)(:, in, :). Every element of the tensor is calculated as the trace of the product of the corresponding mode-2 slices of all the TR core tensors.
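Equation (4) can be evaluated directly from the core tensors; a small NumPy sketch (the helper name is hypothetical) compares the trace formula against a full ring contraction:

```python
import numpy as np

def tr_element(cores, idx):
    """Element of a TR tensor via Eq. (4): trace of the product of
    the mode-2 slices G^(n)(:, i_n, :). cores[n] has shape
    (R_n, I_n, R_{n+1}) with R_{N+1} = R_1."""
    mats = [G[:, i, :] for G, i in zip(cores, idx)]
    return np.trace(np.linalg.multi_dot(mats))

rng = np.random.default_rng(0)
# small 3rd-order example with TR-rank (2, 3, 4) and sizes (5, 6, 7)
cores = [rng.standard_normal(s) for s in [(2, 5, 3), (3, 6, 4), (4, 7, 2)]]
# reference: contract the whole ring at once
X = np.einsum('aib,bjc,cka->ijk', *cores)
print(np.isclose(tr_element(cores, (1, 2, 3)), X[1, 2, 3]))  # True
```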

III. TENSOR RING WEIGHTED OPTIMIZATION

Based on TR decomposition, we propose the TR-WOPT algorithm, which is formulated as follows. Let T ∈ R^(I1×I2×···×IN) be the incomplete tensor with missing entries filled with zeros, and let X({G(n)}_{n=1}^{N}) be the tensor approximated by the core tensors of the TR decomposition. The proposed algorithm follows the "tensor decomposition based" approach and the model is described in (2): find the TR core tensors of the incomplete tensor, then use them to approximate the missing entries. To minimize the model by a gradient-based algorithm, the problem is reformulated as the following optimization model:

f(G(1), ..., G(N)) = (1/2) ‖W ∗ (T − X({G(n)}_{n=1}^{N}))‖²F.   (5)

This is the objective function of an optimization problem in which all the core tensors are the optimization variables. From [27], the relation between the approximated tensor X and the core tensors {G(n)}_{n=1}^{N} can be deduced as the following equation:

X_<n> = G(n)_(2) (G(≠n)_<2>)ᵀ,   (6)

where G(≠n) ∈ R^(Rn+1 × ∏_{i=1, i≠n}^{N} Ii × Rn) is a subchain tensor obtained by merging all core tensors except the nth one, i.e., G(≠n) := X(G(n+1), ..., G(N), G(1), ..., G(n−1)). Because each of the core tensors is independent, we can optimize them independently. The optimization function w.r.t. G(n) can be written as:

f(G(n)) = (1/2) ‖W_<n> ∗ (T_<n> − G(n)_(2) (G(≠n)_<2>)ᵀ)‖²F,   (7)

where all the other core tensors are considered fixed. Next, we can deduce the partial derivative of the objective function (7) w.r.t. G(n)_(2) as follows:

∂f/∂G(n)_(2) = (W_<n> ∗ (G(n)_(2) (G(≠n)_<2>)ᵀ − T_<n>)) G(≠n)_<2>.   (8)

For n = 1, ..., N, the gradients of all the core tensors can be obtained, and the core tensors can then be optimized by any gradient-based optimization algorithm. Furthermore, if there are no missing entries in the tensor data, our algorithm can also be used as a TR decomposition algorithm. The whole process of applying TR-WOPT to tensor completion is listed in Algorithm 1. The details of the gradient descent algorithm and the parameter settings used in this paper are explained in Section IV.

Algorithm 1 Tensor ring weighted optimization (TR-WOPT)
1: Input: incomplete tensor T, weight tensor W, TR-rank R1, ..., RN, and randomly initialized {G(n)}_{n=1}^{N}.
2: While the stopping condition is not satisfied
3:   For n = 1 : N
4:     Compute the gradient of G(n) according to (8).
5:   End
6:   Update {G(n)}_{n=1}^{N} by a gradient descent algorithm.
7: End while
8: Y = PΩ(T) + PΩ̄(X({G(n)}_{n=1}^{N}))
9: Output: completed tensor Y.
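A minimal NumPy sketch of Algorithm 1 may clarify the data flow. Note that the paper optimizes the cores with NCG from the Poblano toolbox, while this sketch substitutes plain gradient descent with a fixed step size; the learning rate, initialization scale, and the assumption that W is binary are ours, and the gradient of Eq. (8) is expressed in tensor rather than unfolded form:

```python
import numpy as np

def tr_full(cores):
    """Contract TR cores of shape (R_n, I_n, R_{n+1}) into the full tensor."""
    out = cores[0]
    for G in cores[1:]:
        out = np.tensordot(out, G, axes=(-1, 0))
    return np.trace(out, axis1=0, axis2=-1)         # close the ring

def tr_wopt_gradients(cores, T, W):
    """Gradient of Eq. (7) w.r.t. each core, i.e. Eq. (8) (W binary)."""
    N = len(cores)
    E = W * (tr_full(cores) - T)                    # weighted residual
    grads = []
    for n in range(N):
        # subchain G^(!=n): merge cores n+1, ..., N, 1, ..., n-1
        order = list(range(n + 1, N)) + list(range(n))
        Q = cores[order[0]]
        for k in order[1:]:
            Q = np.tensordot(Q, cores[k], axes=(-1, 0))
        # Q has shape (R_{n+1}, I_{n+1}, ..., I_{n-1}, R_n)
        perm = [n] + list(range(n + 1, N)) + list(range(n))
        En = np.transpose(E, perm)                  # mode-n shifted residual
        g = np.tensordot(En, Q, axes=(list(range(1, N)), list(range(1, N))))
        grads.append(np.transpose(g, (2, 0, 1)))    # back to (R_n, I_n, R_{n+1})
    return grads

def tr_wopt(T, W, ranks, iters=100, lr=1e-3, seed=0):
    """Algorithm 1 with plain gradient descent (illustrative only)."""
    rng = np.random.default_rng(seed)
    N = T.ndim
    cores = [0.1 * rng.standard_normal((ranks[n], T.shape[n], ranks[(n + 1) % N]))
             for n in range(N)]
    for _ in range(iters):
        for G, g in zip(cores, tr_wopt_gradients(cores, T, W)):
            G -= lr * g
    X = tr_full(cores)
    return W * T + (1.0 - W) * X                    # step 8: fill missing entries
```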

IV. EXPERIMENT RESULTS

We conduct several synthetic data experiments and RGB image data experiments to test the performance of our TR-WOPT algorithm. We compare our algorithm with tensor completion algorithms that are similar to ours, e.g., TT-WOPT [24] and CP-WOPT [1]. Moreover, other state-of-the-art algorithms, e.g., FBCP [26] and FaLRTC [16], are also tested in the following experiments.

For the gradient descent method applied in our TR-WOPT, we use nonlinear conjugate gradient (NCG) with line search, as implemented in the Matlab Poblano toolbox [8]. We use two stopping conditions for all the algorithms: the number of iterations reaches 500, or the relative error between two iterations satisfies ‖Y2 − Y1‖/‖Y2‖ < tol = 10⁻⁶. The optimization stops when either condition is satisfied.

For completion performance evaluation, we adopt relative square error (RSE) and peak signal-to-noise ratio (PSNR). RSE is calculated by:

RSE = ‖T_real − Y‖F / ‖T_real‖F,   (9)

where T_real is the real tensor with full observations and Y is the completed tensor. Moreover, for RGB image data, PSNR is obtained by:

PSNR = 10 log10(255² / MSE),   (10)

where MSE is computed as:

MSE = ‖T_real − Y‖²F / num(T_real),   (11)


where num(·) denotes the number of elements of the tensor. In addition, for tensor experiments with random missing entries, we define the missing rate as 1 − M/num(T_real), where M is the number of sampled (i.e., observed) entries.
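The three evaluation measures above can be written compactly (the function names are ours):

```python
import numpy as np

def rse(T_real, Y):
    """Relative square error, Eq. (9)."""
    return np.linalg.norm(T_real - Y) / np.linalg.norm(T_real)

def psnr(T_real, Y):
    """PSNR for 8-bit image data, Eqs. (10)-(11)."""
    mse = np.sum((T_real - Y) ** 2) / T_real.size   # num(T_real) = total entries
    return 10.0 * np.log10(255.0 ** 2 / mse)

def missing_rate(W):
    """1 - M / num(T_real), with W a binary observation mask."""
    return 1.0 - W.sum() / W.size
```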

A. Synthetic data

In this section, we conduct synthetic data experiments to examine the performance of our algorithm and the compared algorithms under different tensor dimensions and different missing rates. TR-WOPT, TT-WOPT, CP-WOPT, FBCP, and FaLRTC are tested in the experiments. The synthetic data is generated by a highly oscillating function, f(x) = sin(x/4)cos(x²), in vector form; the vector is then reshaped into tensors of the required dimensions. We test four different tensor dimensions: 48×48×48 (3D), 16×16×16×16 (4D), 10×10×10×10×10 (5D) and 7×7×7×7×7×7 (6D). The missing rate varies from 0.1 to 0.95. For the rank selection of TR-WOPT, TT-WOPT and CP-WOPT, we adjust the ranks of each algorithm to make the numbers of model parameters as close as possible, thus providing a fair comparison of model representation ability. FBCP tunes the CP rank automatically, and FaLRTC does not need tensor ranks to be set beforehand, so for these two algorithms we only tune the hyper-parameters and record the best performance. The five algorithms are tested and the RSE results are shown in Figure 2.
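The synthetic tensors described above can be generated in a few lines of NumPy. The sampling range of x is not stated in the paper, so the grid below is an assumption, and the oscillating function is written here as sin(x/4)·cos(x²); the helper names are ours.

```python
import numpy as np

# the four tensor shapes tested in the experiments (3D-6D)
SHAPES = [(48,) * 3, (16,) * 4, (10,) * 5, (7,) * 6]

def synthetic_tensor(shape, x_max=100.0):
    """Sample the oscillating function on a uniform grid (grid range assumed)
    and reshape the resulting vector into the requested tensor."""
    n = int(np.prod(shape))
    x = np.linspace(0.0, x_max, n)
    return (np.sin(x / 4.0) * np.cos(x ** 2)).reshape(shape)

def random_mask(shape, missing_rate, seed=0):
    """Binary weight tensor W with roughly the given fraction of entries removed."""
    rng = np.random.default_rng(seed)
    return (rng.random(shape) >= missing_rate).astype(float)
```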

From the figure we can see that TR-WOPT performs well in all four tensor dimensions. Moreover, when the tensor dimension increases, many of the compared algorithms show performance degradation. TT-WOPT is the closest algorithm to TR-WOPT; although the numbers of model parameters are set similarly, TR-WOPT performs better than TT-WOPT in almost all situations, owing to the higher representation ability of the TR-based model.


Figure 2. Synthetic data completion results of five algorithms under different tensor dimensions and different missing rates.

[Figure 3: twelve completed "Lena" images for the 3D, 5D and 9D tensorizations with TR-ranks 12, 24, 36 and 48; the displayed RSE values range from 0.0739 to 0.9940.]

Figure 3. Completion results of TR-WOPT under different tensor dimensions and different TR-ranks using benchmark image "Lena" (missing rate is 0.7).

B. Image experiments

The synthetic data experiment results show that our TR-WOPT performs well on higher-dimension tensors, so in this section we first test the performance of TR-WOPT under different tensor dimensions and TR-ranks. Instead of directly reshaping the image to a higher dimension, we apply a novel tensorization scheme proposed in [24]. The idea is simple: if the size of an RGB image is U × V × 3, and U = u1 × u2 × ··· × ul and V = v1 × v2 × ··· × vl are satisfied, then the image can be tensorized to an (l+1)-dimension tensor of size u1v1 × u2v2 × ··· × ulvl × 3. The first dimension of the higher-dimension tensor stands for a pixel block of the image, and the following dimensions are the extension blocks of the image. The higher-dimension tensor generated by this tensorization scheme is considered a better structure for the image.
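This tensorization can be sketched in NumPy as follows: the image is reshaped to separate the u and v factors, the matching factor pairs are interleaved, and each (u_k, v_k) pair is merged into one mode of size u_k·v_k. The function name and argument names are ours.

```python
import numpy as np

def tensorize_image(img, us, vs):
    """Fold a U x V x 3 image, with U = prod(us) and V = prod(vs),
    into an (l+1)-dimension tensor of size u1*v1 x ... x ul*vl x 3."""
    l = len(us)
    assert img.shape == (int(np.prod(us)), int(np.prod(vs)), 3)
    t = img.reshape(*us, *vs, 3)
    # interleave the factors: (u1, v1, u2, v2, ..., ul, vl, 3)
    perm = [a for k in range(l) for a in (k, l + k)] + [2 * l]
    t = t.transpose(perm)
    # merge each (u_k, v_k) pair into one mode; mode 1 is then a pixel block
    return t.reshape(*(u * v for u, v in zip(us, vs)), 3)
```

For a 256 × 256 × 3 image, `tensorize_image(img, [4] * 4, [4] * 4)` yields the 5-D 16 × 16 × 16 × 16 × 3 form used below, and `[2] * 8` factors yield the 9-D 4 × ··· × 4 × 3 form.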

We choose the benchmark RGB image "Lena" (original size 256 × 256 × 3) in this experiment and tensorize it by the above scheme to 3D (256 × 256 × 3), 5D (16 × 16 × 16 × 16 × 3) and 9D (4 × 4 × 4 × 4 × 4 × 4 × 4 × 4 × 3). For simplicity, the TR-ranks in each experiment are set to the same value, i.e., R1 = R2 = ··· = RN. We conduct the experiments with TR-ranks 12, 24, 36 and 48 respectively. The missing rates of all the data are set to 0.7. Figure 3 shows the visual results and corresponding RSE values of each situation. From the results we can see that the best completion performance is obtained in the 5D case when the TR-rank is set to 48. Hence, properly tensorizing data to a higher dimension can enhance the performance of our algorithm. In the following RGB image experiments, we adopt this tensor structure for our algorithm.

The following experiments consider four different irregular missing situations, i.e., missing in the shapes of alphabet letters, missing by scratching, block missing and line missing. Moreover, random missing cases with high missing rates are also considered. Because all the tensor completion algorithms perform well in low missing rate situations, we only test high random missing rates, i.e., 0.8, 0.9, 0.95 and 0.99. We tune the ranks and hyper-parameters of each algorithm and record its best completion results. Figure 4 and Table I show the visual and numerical completion results of the five algorithms respectively. From the results we can see that TR-WOPT performs better than TT-WOPT, CP-WOPT and FBCP in all the tested situations. However, FaLRTC shows slightly better performance than TR-WOPT on the scratch missing image and the random missing image with missing rate 0.8. This is because these images have a distinct low-rank property and the missing rate is relatively low, which makes it easy for rank-minimization-based algorithms to capture the low-rank structure of the tensor. However, when the missing rate is higher and most of the information is missing, FaLRTC cannot find the low-rank structure of the data, so its performance drops quickly from random missing rate 0.9 to 0.99.

Figure 4. Visual completion results of five algorithms under eight image missing situations. The first and second columns are the original images and the images with the specified missing patterns respectively. The following columns are the completion results of the five algorithms. The first to fourth rows are the completion results of alphabet missing, scratch missing, block missing and line missing respectively. The fifth to last rows are random missing completion results with missing rates 0.8, 0.9, 0.95 and 0.99 respectively.

Table I
NUMERICAL COMPLETION RESULTS OF FIVE ALGORITHMS UNDER EIGHT IMAGE MISSING SITUATIONS.

                 TR-WOPT  TT-WOPT  CP-WOPT  FBCP     FaLRTC
Alphabet  RSE    0.0227   0.0282   0.0901   0.0397   0.0313
          PSNR   37.17    35.30    25.62    32.32    34.40
Scratch   RSE    0.110    0.119    0.231    0.114    0.106
          PSNR   24.55    23.83    18.08    24.20    24.84
Block     RSE    0.0891   0.124    0.176    0.115    0.104
          PSNR   26.21    23.31    20.32    24.01    24.84
Line      RSE    0.101    0.115    0.187    0.116    0.112
          PSNR   24.81    23.70    19.46    23.61    24.72
0.8       RSE    0.128    0.142    0.332    0.101    0.0839
          PSNR   23.59    22.71    15.32    25.70    27.27
0.9       RSE    0.125    0.134    0.414    0.175    0.146
          PSNR   19.97    19.35    9.562    17.01    18.62
0.95      RSE    0.252    0.276    0.530    0.351    0.343
          PSNR   18.20    17.42    11.74    15.32    15.52
0.99      RSE    0.398    0.657    0.679    0.457    0.942
          PSNR   12.59    8.226    7.948    11.38    5.099

V. CONCLUSIONS

Based on low-rank tensor ring decomposition, in this paper we proposed a new tensor completion algorithm named tensor ring weighted optimization (TR-WOPT). The TR core tensors are optimized by a gradient-based method and then used to predict the missing entries of the incomplete tensor. We conducted various synthetic data and real-world data experiments, and the results show that TR-WOPT outperforms the state-of-the-art algorithms in many situations. In addition, we find that tensorizing a lower-dimension tensor to a proper higher-dimension tensor can provide a better data structure and thus improve the performance of our algorithm. The good performance of TR-WOPT in various completion tasks shows the high representation ability and flexibility of TR decomposition, and it also shows that gradient-based algorithms are promising for optimizing tensor decompositions. However, our method requires the TR-ranks to be specified before the experiment, and finding the best TR-ranks for the data is time-consuming. In future work, we will study how to determine TR-ranks automatically.

ACKNOWLEDGMENT

This work was supported by JSPS KAKENHI (Grant Nos. 17K00326, 15H04002, 18K04178), JST CREST (Grant No. JPMJCR1784) and the National Natural Science Foundation of China (Grant No. 61773129).

REFERENCES

[1] Evrim Acar, Daniel M Dunlavy, Tamara G Kolda, and Morten Mørup. Scalable tensor factorizations for incomplete data. Chemometrics and Intelligent Laboratory Systems, 106(1):41–56, 2011.

[2] Animashree Anandkumar, Rong Ge, Daniel Hsu, Sham M Kakade, and Matus Telgarsky. Tensor decompositions for learning latent variable models. The Journal of Machine Learning Research, 15(1):2773–2832, 2014.

[3] Rasmus Bro. PARAFAC. Tutorial and applications. Chemometrics and Intelligent Laboratory Systems, 38(2):149–171, 1997.

[4] Xingwei Cao and Qibin Zhao. Tensorizing generative adversarial nets. arXiv preprint arXiv:1710.10772, 2017.

[5] Andrzej Cichocki, Namgil Lee, Ivan Oseledets, Anh-Huy Phan, Qibin Zhao, Danilo P Mandic, et al. Tensor networks for dimensionality reduction and large-scale optimization: Part 1 low-rank tensor decompositions. Foundations and Trends® in Machine Learning, 9(4-5):249–429, 2016.

[6] Andrzej Cichocki, Danilo Mandic, Lieven De Lathauwer, Guoxu Zhou, Qibin Zhao, Cesar Caiafa, and Huy Anh Phan. Tensor decompositions for signal processing applications: From two-way to multiway component analysis. IEEE Signal Processing Magazine, 32(2):145–163, 2015.

[7] Andrzej Cichocki, Anh-Huy Phan, Qibin Zhao, Namgil Lee, Ivan Oseledets, Masashi Sugiyama, Danilo P Mandic, et al. Tensor networks for dimensionality reduction and large-scale optimization: Part 2 applications and future perspectives. Foundations and Trends® in Machine Learning, 9(6):431–673, 2017.

[8] Daniel M Dunlavy, Tamara G Kolda, and Evrim Acar. Poblano v1.0: A Matlab toolbox for gradient-based optimization. Sandia National Laboratories, Albuquerque, NM and Livermore, CA, Tech. Rep. SAND2010-1422, 2010.

[9] Marko Filipovic and Ante Jukic. Tucker factorization with missing data with application to low-n-rank tensor completion. Multidimensional Systems and Signal Processing, 26(3):677–692, 2015.

[10] Silvia Gandy, Benjamin Recht, and Isao Yamada. Tensor completion and low-n-rank tensor recovery via convex optimization. Inverse Problems, 27(2):025010, 2011.

[11] Christopher J Hillar and Lek-Heng Lim. Most tensor problems are NP-hard. Journal of the ACM (JACM), 60(6):45, 2013.

[12] Yao Hu, Debing Zhang, Jieping Ye, Xuelong Li, and Xiaofei He. Fast and accurate matrix completion via truncated nuclear norm regularization. IEEE Transactions on Pattern Analysis and Machine Intelligence, 35(9):2117–2130, 2013.

[13] Alexandros Karatzoglou, Xavier Amatriain, Linas Baltrunas, and Nuria Oliver. Multiverse recommendation: n-dimensional tensor factorization for context-aware collaborative filtering. In Proceedings of the Fourth ACM Conference on Recommender Systems, pages 79–86. ACM, 2010.

[14] Tamara G Kolda and Brett W Bader. Tensor decompositions and applications. SIAM Review, 51(3):455–500, 2009.

[15] Ji Liu, Przemyslaw Musialski, Peter Wonka, and Jieping Ye. Tensor completion for estimating missing values in visual data. IEEE Transactions on Pattern Analysis and Machine Intelligence, 35(1):208–220, 2013.

[16] Ye Liu, Mingfen Li, Hao Zhang, Hang Wang, Junhua Li, Jie Jia, Yi Wu, and Liqing Zhang. A tensor-based scheme for stroke patients' motor imagery EEG analysis in BCI-FES rehabilitation training. Journal of Neuroscience Methods, 222:238–249, 2014.

[17] Zhen Long, Yipeng Liu, Longxi Chen, and Ce Zhu. Low rank tensor completion for multiway visual data. arXiv preprint arXiv:1805.03967, 2018.

[18] Linyuan Lu and Tao Zhou. Link prediction in complex networks: A survey. Physica A: Statistical Mechanics and its Applications, 390(6):1150–1170, 2011.

[19] Ivan V Oseledets. Tensor-train decomposition. SIAM Journal on Scientific Computing, 33(5):2295–2317, 2011.

[20] Amnon Shashua and Tamir Hazan. Non-negative tensor factorization with applications to statistics and computer vision. In Proceedings of the 22nd International Conference on Machine Learning, pages 792–799. ACM, 2005.

[21] Nicholas D Sidiropoulos, Lieven De Lathauwer, Xiao Fu, Kejun Huang, Evangelos E Papalexakis, and Christos Faloutsos. Tensor decomposition for signal processing and machine learning. IEEE Transactions on Signal Processing, 65(13):3551–3582, 2017.

[22] Ledyard R Tucker. Some mathematical notes on three-mode factor analysis. Psychometrika, 31(3):279–311, 1966.

[23] Wenqi Wang, Yifan Sun, Brian Eriksson, Wenlin Wang, and Vaneet Aggarwal. Wide compression: Tensor ring nets. arXiv preprint arXiv:1802.09052, 2018.

[24] Longhao Yuan, Qibin Zhao, and Jianting Cao. Completion of high order tensor data with missing entries via tensor-train decomposition. In International Conference on Neural Information Processing, pages 222–229. Springer, 2017.

[25] Qibin Zhao, Masashi Sugiyama, and Andrzej Cichocki. Learning efficient tensor representations with ring structure networks. arXiv preprint arXiv:1705.08286, 2017.

[26] Qibin Zhao, Liqing Zhang, and Andrzej Cichocki. Bayesian CP factorization of incomplete tensors with automatic rank determination. IEEE Transactions on Pattern Analysis and Machine Intelligence, 37(9):1751–1763, 2015.

[27] Qibin Zhao, Guoxu Zhou, Shengli Xie, Liqing Zhang, and Andrzej Cichocki. Tensor ring decomposition. arXiv preprint arXiv:1606.05535, 2016.

[28] Qibin Zhao, Guoxu Zhou, Liqing Zhang, Andrzej Cichocki, and Shun-Ichi Amari. Bayesian robust tensor factorization for incomplete multiway data. IEEE Transactions on Neural Networks and Learning Systems, 27(4):736–748, 2016.
