# PyTorch Sparse Model Compression
This package provides several functions for sparse weight compression and size evaluation of PyTorch models.
## Installation
```
pip install model-compression-777 -i https://pypi.org/simple
```
## Usage
Important note: to use this package, you must first prune your model; the pruning methods vary from model to model. Model compression is only effective if the weights are very sparse.
All functions contain docstrings; they are listed here for convenience, along with some notes.
### Model Loading
- `load_pruned(path)`: loads a pruned PyTorch state file by applying the weight masks. Returns a dict whose keys are the array names (e.g. `encoder.0.2.bias`).
- `load_unpruned(path)`: loads a PyTorch state file into a dict without applying masks.
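For example (the import path below is an assumption; adjust it to however the installed package exposes these functions):

```python
# NOTE: module name is assumed; adjust to the package's actual import path
from model_compression import load_pruned

weights = load_pruned("outputs/model/best.th")  # hypothetical checkpoint path
print(list(weights.keys())[:3])  # e.g. ['encoder.0.0.bias', 'encoder.0.0.weight', ...]
```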
### Compression / Decompression
- `to_relative_csr(m, index_bits)`: converts `m` into the column-relative CSR format. `m` must be a 1D or 2D NumPy array; use `.numpy()` on PyTorch tensors first. `index_bits` is the bit width of the relative column spacing; try values around 2~8. Returns a tuple of (nonzero values `v`, column offsets `c`, row indices `r`).
- `from_relative_csr(v, c, r, width)`: utility function that converts the CSR format back into a dense array. It is intended mostly for testing; using it for sparse matrix operations can be very inefficient.
- `compress(vec, data_bits=4, fc_idx_bits=4, conv_idx_bits=5, def_idx_bits=4, row_bits=32)`: compresses common weights. A convolution whose kernel has size 1 is treated as a fully-connected layer. Returns a tuple containing the compressed format and the size of the compressed weight in bytes.
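A minimal round-trip sketch of these helpers (the import path and bit widths are illustrative assumptions):

```python
import numpy as np
# NOTE: module name is assumed; adjust to the package's actual import path
from model_compression import to_relative_csr, from_relative_csr, compress

m = np.zeros((4, 16), dtype=np.float32)
m[0, 3], m[2, 5], m[2, 9] = 0.5, -1.25, 2.0   # a few nonzeros with small column gaps

v, c, r = to_relative_csr(m, index_bits=4)     # the gaps here fit in 4 bits
restored = from_relative_csr(v, c, r, m.shape[1])
assert np.allclose(m, restored)

# compress returns (compressed format, size in bytes)
fmt, nbytes = compress(m)
```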
### Model Size Evaluation
- `get_csr_size_in_bytes(v, c, r, v_width, c_width, r_width)`: returns the summed size of the three arrays, where each element of an array occupies the number of bits given by the corresponding `*_width` argument.
- `print_weight_info(weights_normalized)`: prints some info about a weights dict: for each weight matrix, its min, max, total number of elements, and sparsity. Returns the global sparsity.
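Continuing the round-trip sketch above, and assuming `get_csr_size_in_bytes` simply sums the three arrays' bit widths:

```python
# 4-bit values, 4-bit relative offsets, 32-bit row pointers (widths are illustrative)
nbytes = get_csr_size_in_bytes(v, c, r, v_width=4, c_width=4, r_width=32)

# should be close to the raw bit count rounded to bytes
estimate = (len(v) * 4 + len(c) * 4 + len(r) * 32) / 8
print(nbytes, estimate)
```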
## Example
First, we load the weights dictionary from a state file.
```python
weights = load_pruned("outputs/exp_dset=verso_ssd,prune_preset=verso2/best.th")
```
We get some basic info about these weights.
```python
print_weight_info(weights)
```
```
encoder.0.0.bias -0.1661 0.3005 48
encoder.0.0.weight -0.4365 0.3742 384
52.3438%
encoder.0.2.bias -0.4893 0.8613 96
encoder.0.2.weight -0.7753 1.5334 4608
23.7413%
encoder.1.0.bias -0.0760 0.2138 96
encoder.1.0.weight -0.6905 0.7302 36864
7.7474%
encoder.1.2.bias -0.4025 1.1158 192
encoder.1.2.weight -0.9252 0.9893 18432
15.1204%
encoder.2.0.bias -0.1144 0.2732 192
encoder.2.0.weight -0.6058 0.6952 147456
8.4371%
encoder.2.2.bias -0.3144 1.1645 384
encoder.2.2.weight -0.7645 0.8168 73728
12.1487%
encoder.3.0.bias -0.2279 0.3234 384
encoder.3.0.weight -0.7191 0.6821 589824
9.6261%
encoder.3.2.bias -0.3796 0.6909 768
encoder.3.2.weight -0.9781 0.9608 294912
14.5667%
encoder.4.0.bias -0.2516 0.1453 768
encoder.4.0.weight -0.6874 0.8081 2359296
10.0274%
encoder.4.2.bias -0.2767 0.3749 1536
encoder.4.2.weight -1.0705 0.9233 1179648
14.0047%
decoder.0.0.bias -0.4838 0.4492 1536
decoder.0.0.weight -1.0980 1.1172 1179648
12.3450%
decoder.0.2.bias -0.5981 0.9222 384
decoder.0.2.weight -0.9818 0.7783 2359296
6.3385%
decoder.1.0.bias -0.7151 0.4123 768
decoder.1.0.weight -1.4589 1.4236 294912
9.6459%
decoder.1.2.bias -0.4267 0.8702 192
decoder.1.2.weight -1.0842 1.5031 589824
5.3640%
decoder.2.0.bias -1.0682 0.7818 384
decoder.2.0.weight -1.3639 1.6044 73728
11.8056%
decoder.2.2.bias -0.0633 0.5356 96
decoder.2.2.weight -1.9591 1.6501 147456
6.6155%
decoder.3.0.bias -1.1293 0.3498 192
decoder.3.0.weight -1.3212 1.9597 18432
18.1532%
decoder.3.2.bias -0.0459 0.1919 48
decoder.3.2.weight -0.8982 0.6510 36864
9.1092%
decoder.4.0.bias -0.3222 0.4334 96
decoder.4.0.weight -0.9638 1.0617 4608
24.0885%
decoder.4.2.bias -0.0128 -0.0128 1
decoder.4.2.weight -0.3609 0.3434 384
48.4375%
lstm.lstm.bias_ih_l0 -0.1892 0.5374 3072
lstm.lstm.bias_hh_l0 -0.2168 0.5108 3072
lstm.lstm.bias_ih_l1 -0.1490 0.4970 3072
lstm.lstm.bias_hh_l1 -0.1674 0.4793 3072
lstm.lstm.weight_ih_l0 -0.8711 0.8251 2359296
8.8214%
lstm.lstm.weight_ih_l1 -0.9791 1.0533 2359296
8.8558%
lstm.lstm.weight_hh_l0 -0.7069 0.7139 2359296
7.3889%
lstm.lstm.weight_hh_l1 -0.8010 0.8255 2359296
6.0804%
0.0873541874651943
0.9126458125348057
```
This is a U-Net architecture with 5 levels of encoding and decoding, which are conv1d layers; at the center are 2 layers of LSTM. After pruning, the model has a global sparsity of 91.26%, meaning only 8.74% of the values are nonzero. There are approximately 18M parameters.
We then try to compress all of the weights (entries whose names contain "activation" are skipped). At the end, the total compressed size is printed.
```python
compressed = {}
size = 0
for name, vec in weights.items():
    if name.find("activation") > -1: continue
    vec = vec.numpy()
    c, s = compress(vec, fc_idx_bits=4, conv_idx_bits=5, def_idx_bits=5)
    compressed[name] = c
    size += s

print("\n%.2f KiB" % (size / 1024,))
```
```
2108.52 KiB
```
This is approximately 1/9 the size of the original model, if the original model were stored in 8-bit precision.
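(The weight table above sums to roughly 18.8M parameters; at one byte per parameter that is about 18,400 KiB, and 2108.52 KiB is about 1/8.7 of that.)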
Now, we try to run inference on this set of compressed weights. First, we define some helper functions for the encoder and decoder.
```python
import numpy as np

def beco_matmul(At, B):
    """
    high-level simulation of beco matrix multiply behavior. note that beco
    assumes the first matrix is stored in memory transposed, and both matrices
    (when stored) need to have a multiple of 4 as their width.
    """
    return At.T @ B
```
```python
def convolve(W, m, bias, stride):
    """
    efficient convolution algorithm with a batch size B on a NORMAL weight matrix.
    this is to 1. be used for testing and
               2. demonstrate the underlying principles of sparse convolution.
    """
    B = 8
    out_ch, K, in_ch = W.shape
    in_len = m.shape[0]
    out = np.zeros((((in_len - stride) // (K - stride)) if K > stride else in_len // K, out_ch))
    for o in range(out_ch):
        queue = []
        out_idx = 0
        for row in range(0, in_len, B):
            curr_matrix = m[row : row + B]
            c = beco_matmul(curr_matrix.T, W[o].T)
            assert len(c.shape) == 2
            queue.extend(c)
            while len(queue) >= K:
                s = sum([queue[k][k] for k in range(K)])
                queue = queue[stride:]
                out[out_idx][o], out_idx = s + bias[o], out_idx + 1
    return out
```
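Since `convolve` works on dense weights, it can be sanity-checked against a direct sliding-window computation. The harness below is our own illustration, not part of the package:

```python
import numpy as np

out_ch, K, in_ch, stride, in_len = 3, 8, 4, 4, 32
W = np.random.rand(out_ch, K, in_ch)
bias = np.random.rand(out_ch)
m = np.random.rand(in_len, in_ch)          # each row is one input position

out = convolve(W, m, bias, stride)

# direct check: out[i, o] = bias[o] + sum_k W[o, k] . m[i*stride + k]
ref = np.array([[bias[o] + sum(W[o, k] @ m[i * stride + k] for k in range(K))
                 for o in range(out_ch)]
                for i in range(out.shape[0])])
assert np.allclose(out, ref)
```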
```python
def convolve_sparse(W, bias, m, in_len, out_ch, in_ch, k, stride):
    """
    performs convolution on input matrix, where each row is a channel.
    weights W is an array of CSR matrix (v, c, r) pairs, each corresponding to an output channel.
    returns an output matrix where each row is a channel, and is thus chainable.

    pseudocode: (only for reference, might not completely match code)
        C <- zeros(out_ch * out_len)
        for each output channel:
            W <- weights matrix corr. to this output channel (dim=k*in_ch)
            for each batch of L rows in W:
                At <- zeros(0 * in_len)
                B <- zeros(0 * L)
                for each l of the L rows keep a head pointer p_l, and current column c_l
                while we have not exhausted all head pointers:
                    find minimum col num of all heads
                    reconstruct column at that index from W
                    transpose this column and vstack it to matrix B
                    pick out corr. row in input matrix and vstack to At
                    increment p_l and c_l for all rows that have the minimum col num
                # at this point, should have At=(x*in_len) and B=(x*L), where x is nonzero count
                call beco matmul to obtain U_l = At.T @ B (dim=in_len*L)
            hstack all U_l's to form matrix U (dim=in_len*k)
            for i = 0 to in_len step stride:
                s <- sum of U[i, 0] to U[i + k, k]
                C[out channel idx][j++] <- s
        return C

    TODO: support batch sizes that are not the entire length of input
    NOTE TO SELF: when transcribing to C, fix & unroll L
    """
    L = 4
    out_len = ((in_len - stride) // (k - stride)) if k > stride else in_len // k
    C = np.zeros((out_ch, out_len))
    for o in range(out_ch):
        v, c, r = W[o]
        U_l = []
        for l in range(0, k, L):
            At = np.zeros((0, in_len))
            B = np.zeros((0, L))
            ll = min(k - l, L)
            p_l = [r[l + i] for i in range(ll)]
            c_l = [c[p_l[i]] - 1 if p_l[i] < r[l + i + 1] else 9999 for i in range(ll)]
            while np.any([p_l[i] < r[l + i + 1] for i in range(ll)]):
                min_val = 9999
                for i in range(ll):
                    if p_l[i] < r[l + i + 1]:
                        min_val = min(min_val, c_l[i])
                rc = np.zeros((L,))
                for i in range(ll):
                    if c_l[i] == min_val:
                        rc[i] = v[p_l[i]]
                        p_l[i] += 1
                        if p_l[i] < r[l + i + 1]: c_l[i] += c[p_l[i]]
                B = np.vstack([B, rc])
                At = np.vstack([At, m[min_val]])
            U_l.append(beco_matmul(At, B))
        U = np.hstack(U_l)
        j = 0
        for i in range(0, in_len - k + 1, stride):
            s = np.sum([U[i + kk, kk] for kk in range(k)])
            C[o][j], j = s + bias[o], j + 1
    return C
```
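To test `convolve_sparse` end to end we need weights in the relative-CSR layout it expects. The decoding loop above implies that, within each row, the first stored offset is `column + 1` and each later offset is the gap to the previous nonzero column; the reference encoder below is our assumption of that layout (the package's `to_relative_csr` may differ in details such as index-width padding):

```python
import numpy as np

def to_relative_csr_ref(m):
    """Hypothetical encoder matching the layout convolve_sparse decodes."""
    v, c, r = [], [], [0]
    for row in m:
        prev = -1
        for col in np.flatnonzero(row):
            v.append(row[col])
            c.append(col - prev)   # first offset is col+1, then gaps
            prev = col
        r.append(len(v))
    return np.array(v), np.array(c), np.array(r)

rng = np.random.default_rng(0)
out_ch, in_ch, k, stride, in_len = 4, 6, 8, 4, 64
W_dense = rng.random((out_ch, k, in_ch)) * (rng.random((out_ch, k, in_ch)) < 0.2)
bias = rng.random(out_ch)
m = rng.random((in_ch, in_len))            # each row is a channel

W_csr = [to_relative_csr_ref(W_dense[o]) for o in range(out_ch)]
C = convolve_sparse(W_csr, bias, m, in_len, out_ch, in_ch, k, stride)

# dense check: C[o, j] = bias[o] + sum_kk W[o, kk] . m[:, j*stride + kk]
ref = np.array([[bias[o] + sum(W_dense[o, kk] @ m[:, j * stride + kk] for kk in range(k))
                 for j in range(C.shape[1])] for o in range(out_ch)])
assert np.allclose(C, ref)
```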
```python
def fc_sparse(W, bias, m, out_ch, in_len):
    """
    calculates fully-connected layer on input matrix, where each row is a channel.
    weights W is a tuple (v, c, r).
    returns an output matrix where each row is a channel, and is thus chainable.

    pseudocode:
        for each batch of L rows in weight matrix W (dim=out_ch*in_ch):
            At <- zeros(0 * L)
            B <- zeros(0 * in_len)
            for each l of the L rows keep a head pointer p_l, and current column c_l
            while we have not exhausted all head pointers:
                find minimum col num of all heads
                reconstruct column at that index from the current L-row submatrix of W
                transpose this column and vstack it to matrix At
                pick out corr. row in input matrix and vstack to B
                increment p_l and c_l for all rows that have the minimum col num
            # at this point, should have At=(x*L) and B=(x*in_len), where x is nonzero count
            call beco matmul to obtain U_l = At.T @ B (dim=L*in_len)
        vstack all U_l's to form matrix C (dim=out_ch*in_len)
        return C
    """
    L = 4
    v, c, r = W
    U_l = []
    for l in range(0, out_ch, L):
        At = np.zeros((0, L))
        B = np.zeros((0, in_len))
        ll = min(out_ch - l, L)
        p_l = [r[l + i] for i in range(ll)]
        c_l = [c[p_l[i]] - 1 if p_l[i] < r[l + i + 1] else 9999 for i in range(ll)]
        while np.any([p_l[i] < r[l + i + 1] for i in range(ll)]):
            min_val = 9999
            for i in range(ll):
                if p_l[i] < r[l + i + 1]:
                    min_val = min(min_val, c_l[i])
            rc = np.zeros((L,))
            for i in range(ll):
                if c_l[i] == min_val:
                    rc[i] = v[p_l[i]]
                    p_l[i] += 1
                    if p_l[i] < r[l + i + 1]: c_l[i] += c[p_l[i]]
            At = np.vstack([At, rc])
            B = np.vstack([B, m[min_val]])
        U_l.append(beco_matmul(At, B))
    C = np.vstack(U_l) + np.expand_dims(bias, 1)
    return C
```
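The same idea checks `fc_sparse`, reusing the hypothetical `to_relative_csr_ref` encoder and `rng` from the previous sketch (note the decoder emits `L = 4` rows per group, so `out_ch` is kept a multiple of 4 here):

```python
out_ch, in_ch, in_len = 8, 16, 10
W_fc = rng.random((out_ch, in_ch)) * (rng.random((out_ch, in_ch)) < 0.3)
bias_fc = rng.random(out_ch)
m_fc = rng.random((in_ch, in_len))         # each row is a channel

C = fc_sparse(to_relative_csr_ref(W_fc), bias_fc, m_fc, out_ch, in_len)
assert np.allclose(C, W_fc @ m_fc + bias_fc[:, None])
```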
```python
GLU = lambda x: x[:x.shape[0] // 2] / (1 + np.exp(-x[x.shape[0] // 2:]))
ReLU = lambda x: np.maximum(0, x)
```
Now we attempt to run the encoding process for an input of length 2560.
```python
layers = [
    (compressed["encoder." + str(i) + ".0.weight"], compressed["encoder." + str(i) + ".0.bias"],
     compressed["encoder." + str(i) + ".2.weight"], compressed["encoder." + str(i) + ".2.bias"])
    for i in range(5)
]
chs = [(1, 48), (48, 96), (96, 192), (192, 384), (384, 768)]

v = np.random.rand(1, 2560)
m = v
for (l1, b1, l2, b2), (ch1, ch2) in zip(layers, chs):
    m = convolve_sparse(l1, b1, m, in_len=m.shape[1],
                        out_ch=ch2, in_ch=ch1, k=8, stride=4)
    m = ReLU(m)
    m = fc_sparse(l2, b2, m, ch2 * 2, m.shape[1])
    m = GLU(m)
print(m)
```
```
[[ 2.16402284e-05]
[-6.86438298e-01]
[ 7.45622957e-04]
[ 1.01900820e-01]
[ 1.54519461e+00]
[-1.25690016e-03]
[-1.94400923e-01]
[ 3.04551532e-01]
[-1.82606233e+00]
[-1.41211974e-02]
[ 1.53868863e-02]
[ 1.29516874e-02]
[-1.50087096e-04]
[ 1.49187872e-01]
[-3.98359375e-04]
[-1.05189119e-03]
[ 4.18792611e-06]
[ 2.80498345e-03]
[-2.91582238e-04]
[-2.22297634e+00]
[ 8.78154058e-04]
[ 2.28849150e-03]
[-1.13989553e-01]
[-1.64408021e-04]
[ 4.07686757e-06]
[ 2.78668457e-05]
[-2.11741295e-02]
[-3.63812103e+00]
[ 2.82300887e-02]
[-1.02316345e-04]
[-6.79898742e+00]
[ 3.79131864e-02]
[ 1.72463322e-04]
[ 1.19783254e-02]
[-8.29995804e-01]
[ 5.08629907e-02]
[ 2.59044820e-04]
[-1.62236813e-02]
[ 5.15336508e+00]
[ 4.12606848e-01]
[ 6.84838350e+00]
[-4.60875202e-02]
[ 1.33798626e-06]
[ 6.69906784e-04]
[-1.61797997e-02]
[-5.05738175e-05]
[ 1.18792637e-02]
[ 2.01824466e-08]
[-2.02370504e-02]
[ 8.79980400e-05]
[-2.79446803e-03]
[ 4.29067547e-03]
[ 9.52100311e-06]
[-3.99108471e-01]
[ 5.08875769e-03]
[ 3.10131139e-03]
[ 1.00726565e+00]
[ 9.37067682e-03]
[-1.82911806e-04]
[-1.51223576e-04]
[-4.77106715e-03]
[ 5.57939170e-05]
[-1.26068642e+00]
[-4.72148217e-02]
[ 5.22163521e-02]
[ 1.12867614e+01]
[-5.06816381e-01]
[-6.20266107e-03]
[ 1.56053440e-04]
[ 1.78338047e-01]
[-1.06517999e-01]
[ 3.33904488e+00]
[ 3.25012307e-04]
[-5.29629197e-01]
[ 1.25449269e-02]
[-8.83551951e-04]
[ 1.54179887e-02]
[-5.81193718e-01]
[ 1.96658746e-05]
[ 1.75131692e-02]
[ 2.26822252e+00]
[ 2.14782461e-04]
[-4.18358514e-02]
[-1.58986807e-03]
[ 1.51929710e-04]
[ 5.18822202e-02]
[ 1.19674115e-03]
[ 1.55362947e-04]
[ 3.29111685e+00]
[ 1.69947926e-04]
[-8.83893177e-01]
[-1.52082086e-01]
[ 3.66975067e-01]
[ 4.57957368e-06]
[-9.59310654e+00]
[-2.22869251e-02]
[-1.77374137e-02]
[-2.87703342e-04]
[-3.81423295e-03]
[ 1.65667095e-01]
[ 2.63715392e+00]
[ 3.63110961e-04]
[ 7.71832435e+00]
[-1.67599192e-03]
[-1.60511397e-04]
[ 6.53566806e-01]
[ 2.07551465e-01]
[-4.38386192e-02]
[-7.98142110e-01]
[ 7.43911643e-03]
[ 3.03291535e-02]
[ 1.05395086e-05]
[-1.36504051e-03]
[-1.44864165e+00]
[ 6.43612086e-01]
[-5.11755452e-03]
[ 1.34076761e-05]
[-4.80616965e-05]
[-1.65409687e-03]
[-1.22409953e-01]
[ 3.79309625e-07]
[ 4.90687141e-01]
[ 4.17284066e-02]
[ 1.37489995e-05]
[-5.32538015e-05]
[ 1.46352930e-03]
[ 1.16743909e-01]
[ 1.12893540e-05]
[ 5.80462330e-05]
[-1.49926878e+00]
[ 3.26890142e-03]
[-6.13356936e-02]
[-3.06065654e+00]
[-1.53681671e-04]
[ 1.53803482e-01]
[-2.14382184e-05]
[-4.82365244e-01]
[-1.40984863e-02]
[-4.33194789e-01]
[ 1.27913601e-01]
[-2.53693934e-04]
[ 3.53165355e-04]
[-2.16453835e-02]
[-1.27677791e-03]
[ 1.46852580e-01]
[-6.30717622e-01]
[-3.50185824e-02]
[-4.35849041e-05]
[ 1.01112814e-02]
[ 1.18353492e-02]
[ 2.02767793e-03]
[ 2.80404036e-02]
[-1.53773016e+00]
[ 9.11765761e-04]
[-1.30189308e-02]
[-2.07427172e-03]
[ 1.64920323e-05]
[-2.33390860e-01]
[-2.23327682e-01]
[ 6.70775256e-03]
[ 1.45987808e+00]
[ 3.66868477e-04]
[-2.04600061e-03]
[ 2.14193515e-04]
[-3.58027862e-01]
[-6.01612444e-03]
[ 3.61380677e-03]
[ 2.50839046e+00]
[ 5.04686941e-06]
[-1.25481575e-01]
[ 1.27662593e-01]
[ 2.12404785e-04]
[ 1.63952821e+00]
[-1.10040230e-02]
[ 3.17762399e-04]
[ 9.73954321e-04]
[ 6.28515455e-06]
[ 7.98612850e-03]
[ 1.49517229e+00]
[-1.57429942e+00]
[-9.13901410e-06]
[ 2.71168130e-03]
[ 1.94714109e-03]
[-9.87627215e-04]
[-2.40376381e+00]
[ 4.64995931e-03]
[-1.04441135e-01]
[ 4.10489906e+00]
[-4.99242858e-03]
[-3.61056066e+00]
[-2.63041184e-03]
[-5.48211854e-01]
[ 1.14669233e-04]
[-1.89198456e-05]
[-3.03931360e-01]
[-6.10429951e-04]
[-2.15477387e+01]
[-4.62507162e-06]
[ 5.88701186e-01]
[-5.83893290e-01]
[-1.07007164e-01]
[-1.40714500e-04]
[-1.98141217e-05]
[-3.09879236e+00]
[-3.55130663e-02]
[ 1.74155030e+00]
[ 2.87856661e-02]
[ 2.13640412e-05]
[-4.04480756e-03]
[-1.97696281e-01]
[ 5.39750972e-02]
[ 7.76877425e-01]
[-2.10957728e-02]
[-2.77669274e-01]
[ 2.57012836e-07]
[ 2.81173993e+00]
[-4.28314976e-02]
[ 7.65993752e-03]
[ 2.03518215e-02]
[-3.24292221e-01]
[-2.45970421e-01]
[ 2.56945635e-01]
[-2.62702870e-06]
[-1.12670145e-05]
[ 5.15164221e-01]
[-4.49232940e-02]
[ 5.59408103e-02]
[ 9.71112813e-03]
[-5.17330042e-06]
[ 5.48824564e-01]
[-5.83347712e-03]
[ 1.10468682e-05]
[ 3.17283761e+00]
[ 7.43085723e-03]
[ 9.36713378e-04]
[ 1.29352082e-01]
[ 3.23289907e-02]
[ 3.21018030e-03]
[-2.39518585e-05]
[ 4.37457536e-03]
[-2.43856936e+00]
[-3.26500881e-04]
[ 3.30745847e-01]
[ 2.63392728e-02]
[ 6.21687231e-02]
[ 3.47899105e-03]
[ 8.85809543e-04]
[ 8.68417310e-05]
[ 1.98765642e-08]
[ 3.33584593e-01]
[-1.51657715e-02]
[ 2.18688681e+00]
[-3.38205822e-06]
[-2.25584433e+00]
[-8.76422957e-04]
[-1.29229938e-02]
[ 3.76360192e-02]
[ 5.28354629e-03]
[-8.13676880e-02]
[-8.18348349e-01]
[-4.10753391e-04]
[ 1.08430247e-02]
[-6.49048872e-02]
[-1.24394698e-03]
[ 1.87675323e-03]
[ 1.02566077e-02]
[ 5.93111708e-03]
[-8.38771727e-03]
[-1.05752448e-03]
[ 3.24148280e-01]
[ 1.34762069e-02]
[ 8.15255324e-03]
[ 2.54803680e-05]
[ 1.38097048e-02]
[-1.72074164e+00]
[-8.36883045e-01]
[-4.47816041e-02]
[ 1.06838153e+00]
[ 1.28585848e-03]
[-5.31533068e-01]
[-4.35341995e+00]
[ 1.24055956e-05]
[ 2.49220111e+00]
[ 1.15555923e-02]
[-2.81051299e-03]
[ 4.12665909e-05]
[ 9.51403746e-03]
[-2.83230931e-02]
[ 1.33184211e-05]
[-2.86244271e-04]
[ 1.36530411e-04]
[-2.55850061e-04]
[ 8.76205645e-01]
[-6.01374047e-04]
[ 1.43217177e-02]
[-6.58725466e+00]
[-2.31478624e-02]
[ 1.73567857e-04]
[-5.95972627e-03]
[-1.95804004e+00]
[-9.86885318e-06]
[ 7.57890266e-03]
[ 6.46906533e-05]
[-3.42259903e+00]
[-1.58723229e-05]
[ 8.71496244e-01]
[ 4.76379254e+00]
[-8.83978479e+00]
[ 5.63618172e-01]
[ 2.16942486e-08]
[-1.98828644e+00]
[ 8.03220723e-02]
[ 3.85485274e-02]
[-2.30985211e+00]
[-1.35328315e-01]
[ 2.32032338e+00]
[-5.36839891e-05]
[-3.55609206e-01]
[ 1.59054503e-04]
[ 2.34945754e-02]
[-1.63097823e-04]
[-1.70305860e-01]
[ 2.14476625e+00]
[-2.72906820e-01]
[ 4.73794395e-04]
[-2.38820088e+00]
[ 8.29460027e-04]
[-2.10438022e-03]
[ 2.11044907e+00]
[-2.30018846e-03]
[ 5.22365519e-02]
[-3.38349814e-06]
[ 1.49023911e-05]
[-4.95146055e-06]
[ 5.25239860e-02]
[ 1.24732147e-02]
[-7.20640349e-01]
[-9.79490928e-06]
[-5.34025421e-02]
[-3.56543894e-04]
[ 2.29124823e-02]
[-4.02883323e-03]
[-4.37127542e-01]
[-6.37828658e+00]
[-5.22019352e-04]
[ 1.69487361e+00]
[ 7.12729657e-05]
[ 6.74935189e+00]
[ 4.43008186e-05]
[ 2.32269597e-03]
[ 7.90177177e-02]
[ 6.80347327e-03]
[ 1.01023896e-03]
[ 1.26457449e+01]
[ 1.05549123e-02]
[-6.48070639e-04]
[-2.05178642e-03]
[ 6.25396432e-02]
[ 2.78956044e+00]
[-4.42556996e-03]
[ 1.02382159e-02]
[-1.46718178e+00]
[ 1.03863596e+00]
[ 3.34926637e-05]
[-3.97308128e-01]
[-8.66817343e-05]
[-3.34434961e+00]
[ 5.46221337e-03]
[ 5.42731240e-01]
[ 7.78845243e-05]
[-1.32001241e-01]
[ 1.29653925e-02]
[ 1.49284197e-02]
[ 6.67046778e-06]
[ 6.85503369e-04]
[-1.00749703e-03]
[-2.42800748e-05]
[ 1.29648268e-03]
[-1.82440994e-01]
[-1.85046552e-01]
[-5.37561100e+00]
[-1.53297403e-03]
[-2.98212563e-06]
[-4.66388868e+00]
[ 4.76966451e-05]
[ 1.53903551e-02]
[ 3.05159487e-05]
[-8.30903950e-03]
[ 2.82507456e-01]
[ 2.89732887e-05]
[ 3.99109901e-07]
[ 3.79707164e-02]
[ 1.03957938e-02]
[-4.02986185e-03]
[-1.62139128e-05]
[-3.54332009e-01]
[ 1.25108323e-01]
[ 1.61559159e+00]
[ 1.57816067e-01]
[-4.50244465e-02]
[-4.57385382e-03]
[-1.30833268e-02]
[-2.85271435e-02]
[-3.27352052e-02]
[-5.17508871e+00]
[-1.10060127e+00]
[ 1.12430296e+00]
[ 5.75429713e-04]
[-1.18500575e-03]
[-4.55020859e-03]
[-1.02720009e-02]
[ 1.10236830e-01]
[-1.80110854e+00]
[-1.12503709e-01]
[-7.56388550e-02]
[-8.25224739e-05]
[-1.44559755e-03]
[ 5.14813812e-04]
[ 1.69238616e-02]
[ 2.64495774e-01]
[ 1.11180950e-01]
[ 6.39388406e-02]
[-8.59925363e-02]
[ 7.68852429e-01]
[ 2.75934275e-01]
[-1.97347612e-01]
[-2.08485012e-06]
[ 4.22312268e+00]
[ 7.94325783e-02]
[ 8.72165732e-01]
[-1.63718567e-01]
[ 2.48457946e-03]
[-2.15470454e+00]
[-3.68127849e-02]
[-2.52637991e+00]
[ 2.09757629e-02]
[ 1.02027226e-04]
[-1.50091439e-03]
[-1.29389424e-02]
[-2.67227767e-01]
[ 2.63637832e+00]
[-7.31887707e-01]
[-3.03768560e+00]
[-5.80697128e-04]
[-3.61200157e-07]
[-2.33214718e-04]
[ 1.55106408e+00]
[-9.21439011e-02]
[-1.79346098e-05]
[ 1.11918666e-01]
[ 2.24778525e+00]
[ 4.18409832e-02]
[-2.25382824e+00]
[-1.54887528e+00]
[-6.78341407e-04]
[ 5.83745769e-02]
[ 5.86225233e-03]
[ 8.99413367e-04]
[-8.02031847e+00]
[-1.53291872e+01]
[ 3.56382478e-03]
[ 6.69867793e+00]
[ 1.23705054e-02]
[ 2.06633155e+00]
[ 1.48958460e-03]
[-1.43937794e-04]
[ 4.67824039e-04]
[-9.11695625e-03]
[-1.54350227e-06]
[-2.56881447e-02]
[ 8.14284969e+00]
[ 2.18820246e-02]
[-7.76452228e-02]
[-1.08602075e-02]
[ 5.23113691e-01]
[ 2.44219891e-02]
[-8.91727923e-03]
[-6.33188523e-05]
[-1.14842782e-02]
[-1.12255418e-01]
[-1.27707531e-02]
[ 1.23082565e-03]
[ 2.74751232e-03]
[ 1.94464145e-01]
[-3.44148961e-02]
[ 6.22754584e-04]
[-1.94739133e-01]
[ 5.43782656e-02]
[ 2.49569553e-02]
[ 1.93681750e-02]
[ 3.59606489e+00]
[ 4.96651031e-02]
[-2.24570684e-05]
[-2.50065693e-02]
[-2.79840425e-01]
[-6.23010870e-03]
[ 7.68045699e-06]
[-7.75992917e+00]
[ 5.78911393e-03]
[-6.25941646e-02]
[ 4.95852813e-03]
[ 8.46620888e-02]
[ 2.11255753e-02]
[-6.22450822e-02]
[-7.32734387e-01]
[-1.19034059e-02]
[ 6.92300801e-04]
[-4.44862081e-05]
[ 3.83526908e-05]
[ 6.30628623e-01]
[ 2.82474532e-06]
[ 8.63418060e-01]
[-2.72178140e-02]
[ 1.59969082e+00]
[-2.12562751e-02]
[ 4.46086478e-03]
[ 3.40515047e-01]
[ 6.67364022e-01]
[ 8.08496393e-01]
[-1.94834705e-02]
[ 1.82421822e+00]
[ 9.51784218e-01]
[ 6.26745690e-02]
[ 8.33672886e-02]
[ 4.84093395e-01]
[ 1.16867855e+00]
[ 3.40521822e-07]
[ 1.57876155e+01]
[ 2.26083780e-02]
[-5.57226037e-05]
[ 3.85055840e-01]
[-7.79319962e-01]
[-1.21512644e-02]
[ 6.32106516e-02]
[ 1.05585077e-02]
[-4.60781958e-02]
[-6.34508151e-04]
[ 8.17822022e-05]
[ 1.76402543e-02]
[-2.49037957e-04]
[ 2.23480005e-01]
[-1.38142404e-02]
[-4.26721444e-03]
[ 2.86682621e-04]
[ 5.12146955e-02]
[ 1.04628199e-02]
[-1.86784078e-01]
[ 6.05997015e+00]
[ 3.40352256e-02]
[-1.67166501e-01]
[-8.20160180e-04]
[ 6.70320325e+00]
[-3.29330268e-01]
[ 1.18532396e-05]
[ 1.64989498e-02]
[-4.12652258e-01]
[-1.86542621e+00]
[ 6.01979791e-04]
[ 1.30924180e-02]
[ 7.38263060e-01]
[-3.83442494e-01]
[-1.19704748e-02]
[ 7.87062136e-06]
[ 1.37263199e+00]
[ 2.55764462e-02]
[-2.80734816e-04]
[ 1.28136473e-04]
[-7.01014134e+00]
[-4.03710059e-05]
[ 1.82764621e-04]
[-6.66891097e-02]
[-3.38198550e-02]
[ 1.79744168e-03]
[-7.24268728e-01]
[ 3.80762575e+00]
[ 6.38201008e-03]
[-3.16073522e-02]
[-1.55421837e-01]
[-1.74203425e-03]
[-2.61642971e-04]
[ 3.32801303e-05]
[ 2.65677574e-02]
[-7.76727307e-02]
[-8.11328163e+00]
[ 1.25477524e-07]
[-2.88973170e-01]
[ 6.17018147e+00]
[-1.43966521e+00]
[-2.39826458e-01]
[-3.08620171e-05]
[-2.68713286e-06]
[ 3.26780808e-02]
[ 2.53362085e-05]
[ 2.60506755e-01]
[-6.19602111e-02]
[-7.26616055e-02]
[ 1.35499095e-07]
[ 4.92923087e-07]
[ 6.00794271e-01]
[ 3.01588845e-02]
[-3.98670591e+00]
[-1.89174825e-02]
[ 2.92480303e-06]
[ 3.00879456e+00]
[ 5.58964023e-03]
[-3.32947956e+00]
[ 3.35868355e-04]
[-1.47596350e-02]
[ 2.42852575e-04]
[-1.34270688e-01]
[-1.30152640e-01]
[-1.81420537e+00]
[-1.60932700e-07]
[-1.78779174e-03]
[ 3.32851241e-03]
[-1.43737946e-02]
[ 3.71659749e-04]
[ 8.77362297e-03]
[ 2.12546196e-04]
[ 5.09870157e-04]
[ 7.21445325e-02]
[ 8.59226694e-02]
[-1.28442163e-03]
[ 1.79514438e-03]
[ 6.96741979e-01]
[ 1.25852700e+01]
[ 4.02714397e-01]
[ 3.81688011e-02]
[-4.47220980e-02]
[ 2.63155286e-05]
[ 6.59452433e-04]
[-5.12420324e-02]
[-3.75661451e-02]
[ 1.29409290e-02]
[-3.03387537e-05]
[ 1.66052726e-01]
[ 1.94825102e-02]
[ 5.58010388e-06]
[-1.81955542e+00]
[ 5.83099552e-04]
[-8.46250680e-03]
[ 3.85049947e-02]
[ 8.76672411e-01]
[-6.43916003e-01]
[ 5.94354922e-03]
[ 9.96655199e-06]
[ 3.32186463e-02]
[-2.81125979e+00]
[ 3.15975969e-07]
[-3.76321670e-03]
[-3.89063967e-05]
[ 9.66276667e-07]
[-8.53108067e-07]
[ 2.74082808e-04]
[-3.15949522e-05]
[ 2.13291938e-02]
[ 5.02169020e-02]
[ 9.46894171e-02]
[-1.52640293e-04]
[-2.44322598e-04]
[ 7.84715680e-02]
[ 6.20422606e-06]
[-5.35482420e-03]
[ 4.82598350e-04]
[-8.83662231e-06]
[-3.03128475e-03]
[ 3.71285945e-02]
[-8.60507451e-04]
[-6.12265976e-04]
[ 1.07449467e-04]
[ 1.70862324e+00]
[ 4.26967325e-03]
[-1.20888076e+00]
[ 3.01889056e-04]
[-1.32836273e-03]
[ 1.35067347e-05]
[ 1.86683849e-01]
[-1.35094214e+00]
[-3.80354206e-03]
[ 8.19290040e-03]
[-4.02936602e-03]
[-1.06053049e-02]
[ 1.11052230e-03]
[ 3.06709957e-03]
[-1.23437992e-01]
[ 1.01874422e-05]
[-5.51754374e-02]
[ 3.78053542e+00]
[-8.92780698e-02]
[ 1.84058926e-01]
[ 6.43643155e-02]
[ 3.98752642e-05]
[-1.26305128e-03]
[-1.14271756e-03]
[-5.31691110e-06]
[ 4.59026902e-05]
[ 5.80739647e-01]
[-4.53362721e-01]
[-2.43566872e-03]
[ 3.38749460e-03]
[ 1.08677226e-01]
[ 1.75657191e+00]
[-3.55715167e+00]
[ 3.28646678e-03]
[ 3.36527330e-03]
[ 4.49290737e+00]
[ 2.37329957e-05]
[ 5.08757533e-02]
[ 7.52666019e-03]
[ 1.12785564e-01]
[-9.48445880e-04]
[-2.35378996e-02]
[-1.14762991e+00]
[-2.52950330e-03]
[-8.67083483e-01]
[ 1.64981706e-03]
[-2.70895303e-03]
[ 8.17771941e-03]
[-2.33831900e-05]
[-6.61013128e-03]
[-6.08919324e-02]
[-1.39414402e-02]
[ 8.72160953e-02]
[ 4.57992806e-06]
[-9.60445506e-03]
[-5.95732975e-01]
[-3.74012267e-01]
[-1.57993814e-04]
[-9.72814092e-01]
[ 1.61488581e-01]
[-3.47490973e-05]
[-7.91696089e-03]
[-2.92578995e-01]
[ 2.48032721e-01]
[ 5.75428917e-04]
[ 4.11466894e-03]
[ 1.08222004e-05]
[ 2.28163443e-01]
[-9.68510240e-04]
[ 3.06336326e-04]
[ 6.56900104e-03]
[ 1.60143466e-01]
[ 8.17573226e-02]
[-1.24264693e-02]
[ 4.79215402e-05]
[ 4.97459707e-01]
[ 2.84065397e-05]
[ 7.07777033e-06]
[ 3.03473854e-02]
[-6.82642840e-02]
[-3.58612163e-02]
[-4.18496316e-03]
[-9.68937220e-03]
[-1.09420151e-05]
[-2.79087882e-03]
[-4.09690812e-04]
[-7.49116465e-06]
[-4.45514727e-02]
[ 3.73627802e-01]
[-6.11236035e-04]
[-1.64280611e-05]
[-1.55265541e-02]
[ 7.46388305e+00]
[-7.74431458e+00]
[ 2.00235249e-01]
[-9.25944054e-06]
[-6.98843387e-03]
[ 8.51422328e-03]]
```