+All Questions (84)
+
+
+
+
+Some courses which have used libsvm as a tool
+Some applications/tools which have used libsvm
+Where can I find documents/videos of libsvm?
+Where are the change log and earlier versions?
+How to cite LIBSVM?
+I would like to use libsvm in my software. Is there any license problem?
+Is there a repository of additional tools based on libsvm?
+On Unix machines, I got "error in loading shared libraries" or "cannot open shared object file." What happened?
+I have modified the source and would like to build the graphic interface "svm-toy" on MS Windows. How should I do it?
+I am an MS Windows user, but why does only one (svm-toy) of the precompiled .exe files actually run?
+What is the difference between "." and "*" outputted during training?
+Why does the program (including MATLAB or other interfaces) occasionally crash with a segmentation fault?
+How to build a dynamic library (.dll file) on MS Windows?
+On some systems (e.g., Ubuntu), compiling LIBSVM gives many warning messages. Is this a problem, and how can I disable the warning messages?
+In LIBSVM, why don't you use certain C/C++ library functions to make the code shorter?
+Why don't all attributes of my data appear in the training/model files?
+What if my data are non-numerical?
+Why do you consider the sparse format? Will training on dense data be much slower?
+Why is the last line of my data sometimes not read by svm-train?
+Is there a program to check if my data are in the correct format?
+May I put comments in data files?
+How to convert other data formats to LIBSVM format?
+The output of training C-SVM is like the following. What do these values mean?
+Can you explain more about the model file?
+Should I use float or double to store numbers in the cache?
+Does libsvm have special treatments for linear SVM?
+The number of free support vectors is large. What should I do?
+Should I scale training and testing data in a similar way?
+Why does svm-scale.exe on Windows sometimes generate non-ASCII data that is not good for training/prediction?
+Does it make a big difference if I scale each attribute to [0,1] instead of [-1,1]?
+The prediction rate is low. How could I improve it?
+My data are unbalanced. Could libsvm handle such problems?
+What is the difference between nu-SVC and C-SVC?
+The program keeps running (without showing any output). What should I do?
+The program keeps running (with output, i.e. many dots). What should I do?
+The training time is too long. What should I do?
+Does shrinking always help?
+How do I get the decision value(s)?
+How do I get the distance between a point and the hyperplane?
+On 32-bit Linux machines, if I use a large cache (i.e., large -m), why do I sometimes get a "segmentation fault"?
+How do I disable screen output of svm-train?
+I would like to use my own kernel. Any example? In svm.cpp, there are two subroutines for kernel evaluations: k_function() and kernel_function(). Which one should I modify?
+What method does libsvm use for multi-class SVM? Why don't you use the "1-against-the-rest" method?
+I would like to solve L2-loss SVM (i.e., error term is quadratic). How should I modify the code?
+In one-class SVM, parameter nu should be an upper bound of the training error rate. Why do I sometimes get a training error rate bigger than nu?
+Why does the code give NaN (not a number) results?
+Why are the signs of predicted labels and decision values sometimes reversed?
+I don't know class labels of test data. What should I put in the first column of the test file?
+How can I use OpenMP to parallelize LIBSVM on a multicore/shared-memory computer?
+How could I know which training instances are support vectors?
+Why are sv_indices (indices of support vectors) not stored in the saved model file?
+After doing cross validation, why is there no model file outputted?
+Why are my cross-validation results different from those in the Practical Guide?
+On some systems CV accuracy is the same in several runs. How could I use different data partitions? In other words, how do I set random seed in LIBSVM?
+Why does grid.py sometimes fail on Windows?
+Why do grid.py/easy.py sometimes generate the following warning message?
+How do I choose the kernel?
+How does LIBSVM perform parameter selection for multi-class problems?
+How do I choose parameters for one-class SVM as training data are in only one class?
+Instead of grid.py, what if I would like to conduct parameter selection using other programming languages?
+Why does training a probability model (i.e., -b 1) take a longer time?
+Why does using the -b option not give me better accuracy?
+Why do svm-predict -b 0 and -b 1 give different accuracy values?
+How can I save images drawn by svm-toy?
+I press the "load" button to load data points, but why does svm-toy not draw them?
+I would like svm-toy to handle more than three classes of data. What should I do?
+What is the difference between Java version and C++ version of libsvm?
+Is the Java version significantly slower than the C++ version?
+While training I get the following error message: java.lang.OutOfMemoryError. What is wrong?
+Why do you have the main source file svm.m4 and then transform it to svm.java?
+Besides the provided Python-C++ interface, could I use Jython to call libsvm?
+I compile the MATLAB interface without problems, but why do errors occur while running it?
+On 64-bit Windows I compile the MATLAB interface without problems, but why do errors occur while running it?
+Does the MATLAB interface provide a function to do scaling?
+How could I use MATLAB interface for parameter selection?
+I use the MATLAB parallel programming toolbox on a multi-core environment for parameter selection. Why is the program even slower?
+How to use LIBSVM with OpenMP under MATLAB/Octave?
+How could I generate the primal variable w of linear SVM?
+Is there an OCTAVE interface for libsvm?
+How to handle the name conflict between svmtrain in the libsvm matlab interface and that in MATLAB bioinformatics toolbox?
+On Windows I got an error message "Invalid MEX-file: Specific module not found" when running the pre-built MATLAB interface in the windows sub-directory. What should I do?
+LIBSVM supports 1-vs-1 multi-class classification. If instead I would like to use 1-vs-rest, how to implement it using MATLAB interface?
+I tried to install the MATLAB interface on Mac, but failed. What should I do?
+I tried to install the Octave interface on Windows, but failed. What should I do?
+
+
+
+
+
+
+
+Q: Some courses which have used libsvm as a tool
+
+
+Institute for Computer Science,
+Faculty of Applied Science, University of Freiburg, Germany
+
+
+Division of Mathematics and Computer Science.
+Faculteit der Exacte Wetenschappen
+Vrije Universiteit, The Netherlands.
+
+
+Electrical and Computer Engineering Department,
+University of Wisconsin-Madison
+
+
+
+Technion (Israel Institute of Technology), Israel.
+
+
+Computer and Information Sciences Dept., University of Florida
+
+
+The Institute of Computer Science,
+University of Nairobi, Kenya.
+
+
+Applied Mathematics and Computer Science, University of Iceland.
+
+
+SVM tutorial in machine learning
+summer school, University of Chicago, 2005.
+
+
+
+[Go Top]
+
+
+Q: Some applications/tools which have used libsvm
+
+(and maybe liblinear).
+
+
+[Go Top]
+
+
+Q: Where can I find documents/videos of libsvm?
+
+
+
+
+
+Official implementation document:
+
+C.-C. Chang and
+C.-J. Lin.
+LIBSVM
+: a library for support vector machines.
+ACM Transactions on Intelligent
+Systems and Technology, 2:27:1--27:27, 2011.
+pdf, ps.gz,
+ACM digital lib.
+
+
+ Instructions for using LIBSVM are in the README files in the main directory and some sub-directories.
+
+README in the main directory: details all options, data format, and library calls.
+
+tools/README: parameter selection and other tools
+
+A guide for beginners:
+
+C.-W. Hsu, C.-C. Chang, and
+C.-J. Lin.
+
+A practical guide to support vector classification
+
+ An introductory video
+for Windows users.
+
+
+
+[Go Top]
+
+
+Q: Where are the change log and earlier versions?
+
+See the change log.
+
+
+You can download earlier versions
+here.
+
+[Go Top]
+
+
+Q: How to cite LIBSVM?
+
+
+Please cite the following paper:
+
+Chih-Chung Chang and Chih-Jen Lin, LIBSVM
+: a library for support vector machines.
+ACM Transactions on Intelligent Systems and Technology, 2:27:1--27:27, 2011.
+Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm
+
+The bibtex format is
+
+@article{CC01a,
+ author = {Chang, Chih-Chung and Lin, Chih-Jen},
+ title = {{LIBSVM}: A library for support vector machines},
+ journal = {ACM Transactions on Intelligent Systems and Technology},
+ volume = {2},
+ issue = {3},
+ year = {2011},
+ pages = {27:1--27:27},
+ note = {Software available at \url{http://www.csie.ntu.edu.tw/~cjlin/libsvm}}
+}
+
+
+[Go Top]
+
+
+Q: I would like to use libsvm in my software. Is there any license problem?
+
+
+We have "the modified BSD license,"
+so it is very easy to
+use libsvm in your software.
+Please check the COPYRIGHT file for details. Basically
+you need to
+
+
+Clearly indicate that LIBSVM is used.
+
+
+Retain the LIBSVM COPYRIGHT file in your software.
+
+
+It can also be used in commercial products.
+
+[Go Top]
+
+
+Q: Is there a repository of additional tools based on libsvm?
+
+
+Yes, see libsvm
+tools
+
+[Go Top]
+
+
+Q: On Unix machines, I got "error in loading shared libraries" or "cannot open shared object file." What happened?
+
+
+
+This usually happens if you compile the code
+on one machine and run it on another which has incompatible
+libraries.
+Try to recompile the program on that machine or use static linking.
+
+[Go Top]
+
+
+Q: I have modified the source and would like to build the graphic interface "svm-toy" on MS Windows. How should I do it?
+
+
+
+Build it as a project by choosing "Win32 Project."
+On the other hand, for "svm-train" and "svm-predict"
+you want to choose "Win32 Console Project."
+After libsvm 2.5, you can also use the file Makefile.win.
+See details in README.
+
+
+
+If you are not using Makefile.win and see the following
+link error
+
+LIBCMTD.lib(wwincrt0.obj) : error LNK2001: unresolved external symbol
+_wWinMain@16
+
+you may have selected a wrong project type.
+
+[Go Top]
+
+
+Q: I am an MS Windows user, but why does only one (svm-toy) of the precompiled .exe files actually run?
+
+
+
+You need to open a command window
+and type svmtrain.exe to see all options.
+Some examples are in the README file.
+
+[Go Top]
+
+
+Q: What is the difference between "." and "*" outputted during training?
+
+
+
+"." means every 1,000 iterations (or every #data
+iterations if your #data is less than 1,000).
+"*" means that after some iterations of solving
+a smaller shrunk problem,
+we reset to use the whole set. See the
+implementation document for details.
+
+[Go Top]
+
+
+Q: Why does the program (including MATLAB or other interfaces) occasionally crash with a segmentation fault?
+
+
+
+Very likely the program consumes more memory than the
+operating system can provide. Try a smaller data set and see if the
+program still crashes.
+
+[Go Top]
+
+
+Q: How to build a dynamic library (.dll file) on MS Windows?
+
+
+
+The easiest way is to use Makefile.win.
+See details in README.
+
+Alternatively, you can use Visual C++. Here is
+the example using Visual Studio 2013:
+
+Create a Win32 empty DLL project and set (in Project->$Project_Name
+Properties...->Configuration) to "Release."
+ About how to create a new dynamic link library, please refer to
+http://msdn2.microsoft.com/en-us/library/ms235636(VS.80).aspx
+
+ Add svm.cpp, svm.h to your project.
+ Add __WIN32__ and _CRT_SECURE_NO_DEPRECATE to Preprocessor definitions (in
+Project->$Project_Name Properties...->C/C++->Preprocessor)
+ Set Create/Use Precompiled Header to Not Using Precompiled Headers
+(in Project->$Project_Name Properties...->C/C++->Precompiled Headers)
+ Set the path for the Module-Definition File svm.def (in
+Project->$Project_Name Properties...->Linker->Input).
+ Build the DLL.
+ Rename the dll file to libsvm.dll and move it to the correct path.
+
+
+
+
+[Go Top]
+
+
+Q: On some systems (e.g., Ubuntu), compiling LIBSVM gives many warning messages. Is this a problem, and how can I disable the warning messages?
+
+
+
+If you are using a version before 3.18, probably you see
+a warning message like
+
+svm.cpp:2730: warning: ignoring return value of int fscanf(FILE*, const char*, ...), declared with attribute warn_unused_result
+
+This is not a problem; see this page for more
+details on Ubuntu systems.
+To disable the warning message you can replace
+
+CFLAGS = -Wall -Wconversion -O3 -fPIC
+
+with
+
+CFLAGS = -Wall -Wconversion -O3 -fPIC -U_FORTIFY_SOURCE
+
+in Makefile.
+ After version 3.18, we have a better setting so that such warning messages do not appear.
+
+[Go Top]
+
+
+Q: In LIBSVM, why don't you use certain C/C++ library functions to make the code shorter?
+
+
+
+For portability, we use only features defined in ISO C89. Note that features in ISO C99 may not be available everywhere.
+Even the newest gcc lacks some features in C99 (see http://gcc.gnu.org/c99status.html for details).
+If the situation changes in the future,
+we might consider using these newer features.
+
+[Go Top]
+
+
+Q: Why don't all attributes of my data appear in the training/model files?
+
+
+libsvm uses the so-called "sparse" format, where zero
+values do not need to be stored. Hence data with attribute values
+
+1 0 2 0
+
+is represented as
+
+1:1 3:2
+
+
+[Go Top]
+
+
+Q: What if my data are non-numerical?
+
+
+Currently libsvm supports only numerical data.
+You may have to change non-numerical data to
+numerical. For example, you can use several
+binary attributes to represent a categorical
+attribute, as in the example below.
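+
+For instance, a categorical attribute with the three values
+{red, green, blue} can be encoded as three binary attributes:
+
+red   -> 1 0 0
+green -> 0 1 0
+blue  -> 0 0 1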
+
+[Go Top]
+
+
+Q: Why do you consider the sparse format? Will training on dense data be much slower?
+
+
+This is a controversial issue. The kernel
+evaluation (i.e., inner product) of sparse vectors is slower,
+so the total training time can be two or three times
+that of using the dense format.
+However, we cannot support only the dense format, as then we CANNOT
+handle extremely sparse cases. Simplicity of the code is another
+concern. For now we have decided to support
+the sparse format only.
+
+[Go Top]
+
+
+Q: Why is the last line of my data sometimes not read by svm-train?
+
+
+
+We assume that you have '\n' at the end of
+each line. So please press enter at the end
+of your last line.
+
+[Go Top]
+
+
+Q: Is there a program to check if my data are in the correct format?
+
+
+
+The svm-train program in libsvm conducts only a simple check of the input data. To do a
+detailed check, after libsvm 2.85, you can use the python script tools/checkdata.py. See tools/README for details.
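+
+For example, assuming you run it from the main libsvm directory,
+a typical invocation looks like
+
+> python tools/checkdata.py my_data_file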
+
+[Go Top]
+
+
+Q: May I put comments in data files?
+
+
+
+We don't officially support this. But, currently LIBSVM
+is able to process data in the following
+format:
+
+1 1:2 2:1 # your comments
+
+Note that the character ":" should not appear in your
+comments.
+
+
+[Go Top]
+
+
+Q: How to convert other data formats to LIBSVM format?
+
+
+
+It depends on your data format. A simple way is to use
+libsvmwrite in the libsvm matlab/octave interface.
+
+Take a CSV (comma-separated values) file
+in the UCI machine learning repository as an example.
+We download SPECTF.train.
+Labels are in the first column. The following steps produce
+a file in the libsvm format.
+
+matlab> SPECTF = csvread('SPECTF.train'); % read a csv file
+matlab> labels = SPECTF(:, 1); % labels from the 1st column
+matlab> features = SPECTF(:, 2:end);
+matlab> features_sparse = sparse(features); % features must be in a sparse matrix
+matlab> libsvmwrite('SPECTFlibsvm.train', labels, features_sparse);
+
+The transformed data are stored in SPECTFlibsvm.train.
+
+
+Alternatively, you can use convert.c
+to convert CSV format to libsvm format.
+
+[Go Top]
+
+
+Q: The output of training C-SVM is like the following. What do these values mean?
+
+ optimization finished, #iter = 219
+ nu = 0.431030
+ obj = -100.877286, rho = 0.424632
+ nSV = 132, nBSV = 107
+ Total nSV = 132
+
+obj is the optimal objective value of the dual SVM problem.
+rho is the bias term in the decision function
+sgn(w^Tx - rho).
+nSV and nBSV are the numbers of support vectors and bounded support
+vectors (i.e., alpha_i = C). nu-SVM is a somewhat equivalent
+form of C-SVM where C is replaced by nu. nu simply shows the
+corresponding parameter. More details are in the
+
+libsvm document.
+
+[Go Top]
+
+
+Q: Can you explain more about the model file?
+
+
+
+In the model file, after the parameters and other information such as labels, each line represents a support vector.
+Support vectors are listed in the order of the "labels" shown earlier
+(i.e., those from the first class in the "labels" list are
+grouped first, and so on).
+If k is the total number of classes,
+in front of a support vector in class j, there are
+k-1 coefficients
+y*alpha, where the alphas are the dual solutions of the
+following two-class problems:
+
+1 vs j, 2 vs j, ..., j-1 vs j, j vs j+1, j vs j+2, ..., j vs k
+
+and y=1 in first j-1 coefficients, y=-1 in the remaining
+k-j coefficients.
+
+For example, if there are 4 classes, the file looks like:
+
+
++-+-+-+--------------------+
+|1|1|1| |
+|v|v|v| SVs from class 1 |
+|2|3|4| |
++-+-+-+--------------------+
+|1|2|2| |
+|v|v|v| SVs from class 2 |
+|2|3|4| |
++-+-+-+--------------------+
+|1|2|3| |
+|v|v|v| SVs from class 3 |
+|3|3|4| |
++-+-+-+--------------------+
+|1|2|3| |
+|v|v|v| SVs from class 4 |
+|4|4|4| |
++-+-+-+--------------------+
+
+See also
+ an illustration using
+MATLAB/OCTAVE.
+
+[Go Top]
+
+
+Q: Should I use float or double to store numbers in the cache?
+
+
+
+We use float as the default so that you can store more numbers
+in the cache.
+In general this is good enough, but for a few difficult
+cases (e.g., a very large C) where solutions are huge
+numbers, the numerical precision may not be
+enough using only float.
+
+[Go Top]
+
+
+Q: Does libsvm have special treatments for linear SVM?
+
+
+
+
+No, libsvm solves linear/nonlinear SVMs in the
+same way.
+Some tricks could save training/testing time if the
+linear kernel is used, but libsvm does not implement them,
+so libsvm is NOT particularly efficient for linear SVM,
+especially when
+C is large and
+the number of data is much larger
+than the number of attributes.
+You can either use a small C or check
+LIBLINEAR, which is designed for large-scale linear classification.
+
+
+ Please also see our SVM guide
+for a discussion of using RBF and linear
+kernels.
+
+[Go Top]
+
+
+Q: The number of free support vectors is large. What should I do?
+
+
+This usually happens when the data are overfitted.
+If attributes of your data are in large ranges,
+try to scale them. Then the region
+of appropriate parameters may be larger.
+Note that there is a scale program
+in libsvm.
+
+[Go Top]
+
+
+Q: Should I scale training and testing data in a similar way?
+
+
+Yes, you can do the following:
+
+> svm-scale -s scaling_parameters train_data > scaled_train_data
+> svm-scale -r scaling_parameters test_data > scaled_test_data
+
+
+[Go Top]
+
+
+Q: Why does svm-scale.exe on Windows sometimes generate non-ASCII data that is not good for training/prediction?
+
+
+In general this does not happen, but we have observed that in some rare
+situations the output of svm-scale.exe directed to a file (by ">")
+has the wrong encoding. That is, the file is not an ASCII file, so it cannot be
+used for training/prediction. Please let us know if this happens, as at the moment
+we do not clearly see how to fix the problem.
+
+[Go Top]
+
+
+Q: Does it make a big difference if I scale each attribute to [0,1] instead of [-1,1]?
+
+
+
+For the linear scaling method, if the RBF kernel is
+used and parameter selection is conducted, there
+is no difference. Assume Mi and mi are
+respectively the maximal and minimal values of the
+ith attribute. Scaling to [0,1] means
+
+ x'=(x-mi)/(Mi-mi)
+
+For [-1,1],
+
+ x''=2(x-mi)/(Mi-mi)-1.
+
+In the RBF kernel,
+
+ x'-y'=(x-y)/(Mi-mi), x''-y''=2(x-y)/(Mi-mi).
+
+Hence, using (C,g) on the [0,1]-scaled data is the
+same as (C,g/2) on the [-1,1]-scaled data; for example, if (C,g)=(1,4)
+is best for the [0,1]-scaled data, then (C,g)=(1,2) gives identical
+results on the [-1,1]-scaled data.
+
+ Though the performance is the same, the computational
+time may be different. For data with many zero entries,
+[0,1]-scaling keeps the sparsity of the input data and hence
+may save time.
+
+[Go Top]
+
+
+Q: The prediction rate is low. How could I improve it?
+
+
+Try to use the model selection tool grid.py in the tools
+directory to find out good parameters (see the example below).
+To see the importance of model selection,
+please
+see our guide for beginners:
+
+A practical guide to support vector
+classification
+
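+For example, assuming Python and gnuplot are installed (the data file
+name here is illustrative), a typical invocation looks like
+
+> python tools/grid.py heart_scale
+
+which cross-validates over a grid of (C, gamma) values and reports
+the best pair found.
+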
+
+[Go Top]
+
+
+Q: My data are unbalanced. Could libsvm handle such problems?
+
+
+Yes, there is the -wi option. For example, if you use
+
+> svm-train -s 0 -c 10 -w1 1 -w-1 5 data_file
+
+
+the penalty for class "-1" is larger.
+Note that this -w option is for C-SVC only.
+
+[Go Top]
+
+
+Q: What is the difference between nu-SVC and C-SVC?
+
+
+Basically they are the same thing but with different
+parameters. The range of C is from zero to infinity,
+while nu is always in [0,1]. A nice property
+of nu is that it is related to the ratio of
+support vectors and the ratio of the training
+error; for example, nu = 0.1 means that at most 10% of the
+training points are margin errors and at least 10% are support vectors.
+
+[Go Top]
+
+
+Q: The program keeps running (without showing any output). What should I do?
+
+
+You may want to check your data. Each training/testing
+instance must be on one line; it cannot be split across lines.
+In addition, you have to remove empty lines.
+
+[Go Top]
+
+
+Q: The program keeps running (with output, i.e. many dots). What should I do?
+
+
+In theory libsvm is guaranteed to converge.
+Therefore, this means you are
+handling an ill-conditioned situation
+(e.g., too large/small parameters) where numerical
+difficulties occur.
+
+You may get better numerical stability by replacing
+
+typedef float Qfloat;
+
+in svm.cpp with
+
+typedef double Qfloat;
+
+That is, elements in the kernel cache are stored
+in double instead of single. However, this means fewer elements
+can be put in the kernel cache.
+
+[Go Top]
+
+
+Q: The training time is too long. What should I do?
+
+
+For large problems, please specify enough cache size (i.e.,
+-m).
+Slow convergence may happen for some difficult cases (e.g. -c is large).
+You can try to use a looser stopping tolerance with -e.
+If that still doesn't work, you may train only a subset of the data.
+You can use the program subset.py in the directory "tools"
+to obtain a random subset.
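+
+For example, something like the following (see tools/README for the
+exact options; -s 1 selects instances randomly) extracts a random
+500-instance subset:
+
+> python tools/subset.py -s 1 my_data 500 my_data.subset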
+
+
+If you have extremely large data and face this difficulty, please
+contact us. We will be happy to discuss possible solutions.
+
+
+When using large -e, you may want to check if -h 0 (no shrinking) or -h 1 (shrinking) is faster.
+See a related question below.
+
+
+[Go Top]
+
+
+Q: Does shrinking always help?
+
+
+If the number of iterations is high, then shrinking
+often helps.
+However, if the number of iterations is small
+(e.g., you specify a large -e), then
+probably using -h 0 (no shrinking) is better.
+See the
+implementation document for details.
+
+[Go Top]
+
+
+Q: How do I get the decision value(s)?
+
+
+We print out decision values for regression. For classification,
+we solve several binary SVMs for multi-class cases. You
+can easily obtain the values by calling the subroutine
+svm_predict_values. Their corresponding labels
+can be obtained from svm_get_labels.
+Details are in
+README of libsvm package.
+
+
+If you are using MATLAB/OCTAVE interface, svmpredict can directly
+give you decision values. Please see matlab/README for details.
+
+
+We do not recommend the following. But if you would
+like to get values for
+TWO-class classification with labels +1 and -1
+(note: +1 and -1 but not things like 5 and 10)
+in the easiest way, simply add
+
+ printf("%f\n", dec_values[0]*model->label[0]);
+
+after the line
+
+ svm_predict_values(model, x, dec_values);
+
+of the file svm.cpp.
+Positive (negative)
+decision values correspond to data predicted as +1 (-1).
+
+
+
+[Go Top]
+
+
+Q: How do I get the distance between a point and the hyperplane?
+
+
+The distance is |decision_value| / |w|.
+We have |w|^2 = w^Tw = alpha^T Q alpha = 2*(dual_obj + sum alpha_i).
+Thus in svm.cpp please find the place
+where we calculate the dual objective value
+(i.e., the subroutine Solve())
+and add a statement to print w^Tw.
+
+More precisely, here is what you need to do
+
+Search for "calculate objective value" in svm.cpp
+
+ In that place, si->obj is the variable for the objective value
+
+ Add a for loop to calculate the sum of alpha
+
+ Calculate 2*(si->obj + sum of alpha) and print the square root of it. You now get |w|. You
+need to recompile the code
+
+ Check an earlier FAQ on printing decision values. You
+need to recompile the code
+
+
+Then print decision value divided by the |w| value obtained earlier.
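+
+As a concrete illustration, here is a minimal sketch of such a change
+inside the subroutine Solve() in svm.cpp (the variable names follow the
+current code, but the exact placement may differ between versions):
+
+// add after the "calculate objective value" block, where si->obj is set
+double sum_alpha = 0;
+for(int i=0;i<l;i++)
+	sum_alpha += alpha[i];
+double w_norm = sqrt(2*(si->obj + sum_alpha));	// |w|
+info("|w| = %f\n", w_norm);
+
+Each decision value divided by w_norm then gives the distance to the
+hyperplane.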
+
+
+
+[Go Top]
+
+
+Q: On 32-bit Linux machines, if I use a large cache (i.e., large -m), why do I sometimes get a "segmentation fault"?
+
+
+
+On 32-bit machines, the maximum addressable
+memory is 4GB. The Linux kernel uses a 3:1
+split, which means user space is 3G and
+kernel space is 1G. Although there is
+3G of user space, the maximum dynamically allocatable
+memory is 2G. So, if you specify -m near 2G,
+the memory will be exhausted, and svm-train
+will fail when it asks for more memory.
+For more details, please read
+
+this article .
+
+The easiest solution is to switch to a
+ 64-bit machine.
+Otherwise, there are two ways to solve this. If your
+machine supports Intel's PAE (Physical Address
+Extension), you can turn on the option HIGHMEM64G
+in Linux kernel which uses 4G:4G split for
+kernel and user space. If you don't, you can
+try a software `tub' which can eliminate the 2G
+boundary for dynamic allocated memory. The `tub'
+is available at
+http://www.bitwagon.com/tub.html .
+
+
+
+
+[Go Top]
+
+
+Q: How do I disable screen output of svm-train?
+
+
+For command-line users, use the -q option:
+
+> ./svm-train -q heart_scale
+
+
+For library users, set the global variable
+
+extern void (*svm_print_string) (const char *);
+
+to specify the output format. You can disable the output by the following steps:
+
+
+Declare a function to output nothing:
+
+void print_null(const char *s) {}
+
+
+
+Assign the output function of libsvm by
+
+svm_print_string = &print_null;
+
+
+
+Finally, a way used in earlier versions of libsvm
+is to update svm.cpp from
+
+#if 1
+void info(const char *fmt,...)
+
+to
+
+#if 0
+void info(const char *fmt,...)
+
+
+[Go Top]
+
+
+Q: I would like to use my own kernel. Any example? In svm.cpp, there are two subroutines for kernel evaluations: k_function() and kernel_function(). Which one should I modify?
+
+
+An example is "LIBSVM for string data" in LIBSVM Tools.
+
+The reason why we have two functions is as follows.
+For the RBF kernel exp(-g |xi - xj|^2), if we calculate
+xi - xj first and then the norm square, there are 3n operations.
+Thus we consider exp(-g (|xi|^2 - 2dot(xi,xj) +|xj|^2))
+and by calculating all |xi|^2 in the beginning,
+the number of operations is reduced to 2n.
+This is for training. For prediction we cannot
+do this, so a regular subroutine using the 3n operations is
+needed.
+
+The easiest way to have your own kernel is
+to put the same code in these two
+subroutines by replacing any kernel.
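+
+For example, a minimal sketch for a custom kernel k(x,y) = (dot(x,y))^2
+(an illustrative choice, not part of the distributed code) puts the
+same logic in both places:
+
+// in Kernel::k_function() (used for prediction):
+double d = dot(x, y);
+return d*d;
+
+// in a member routine pointed to by kernel_function (used in training),
+// e.g., replacing the body of kernel_linear(int i, int j):
+double d = dot(x[i], x[j]);
+return d*d;
+
+The two must compute exactly the same value; here training would then
+be run with -t 0 so that the modified routine is selected.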
+
+[Go Top]
+
+
+Q: What method does libsvm use for multi-class SVM? Why don't you use the "1-against-the-rest" method?
+
+
+It is one-against-one. We chose it after doing the following
+comparison:
+C.-W. Hsu and C.-J. Lin.
+
+A comparison of methods
+for multi-class support vector machines
+ ,
+IEEE Transactions on Neural Networks , 13(2002), 415-425.
+
+
+"1-against-the rest" is a good method whose performance
+is comparable to "1-against-1." We do the latter
+simply because its training time is shorter.
+
+[Go Top]
+
+
+Q: I would like to solve L2-loss SVM (i.e., error term is quadratic). How should I modify the code?
+
+
+It is extremely easy. Taking C-SVC as an example, to solve
+
+min_{w,b} w^Tw/2 + C \sum max(0, 1 - y_i (w^Tx_i+b))^2,
+
+only two
+places of svm.cpp have to be changed.
+First, modify the following line of
+solve_c_svc from
+
+ s.Solve(l, SVC_Q(*prob,*param,y), minus_ones, y,
+ alpha, Cp, Cn, param->eps, si, param->shrinking);
+
+to
+
+ s.Solve(l, SVC_Q(*prob,*param,y), minus_ones, y,
+ alpha, INF, INF, param->eps, si, param->shrinking);
+
+Second, in the class of SVC_Q, declare C as
+a private variable:
+
+ double C;
+
+In the constructor replace
+
+ for(int i=0;i<prob.l;i++)
+ QD[i]= (Qfloat)(this->*kernel_function)(i,i);
+
+with
+
+ this->C = param.C;
+ for(int i=0;i<prob.l;i++)
+ QD[i]= (Qfloat)(this->*kernel_function)(i,i)+0.5/C;
+
+Then in the subroutine get_Q, after the for loop, add
+
+ if(i >= start && i < len)
+ data[i] += 0.5/C;
+
+
+
+For one-class svm, the modification is exactly the same. For SVR, you don't need an if statement like the above. Instead, you only need a simple assignment:
+
+ data[real_i] += 0.5/C;
+
+
+
+
+For large linear L2-loss SVM, please use
+LIBLINEAR .
+
+[Go Top]
+
+
+Q: In one-class SVM, parameter nu should be an upper bound of the training error rate. Why do I sometimes get a training error rate bigger than nu?
+
+
+
+At the optimum, some training instances should satisfy
+w^Tx - rho = 0. However, numerically the values may be slightly
+smaller than zero, and these instances are then wrongly counted
+as training errors. You can use a smaller stopping tolerance
+(by the -e option) to make this problem less serious.
+
+
+This issue does not occur for nu-SVC for
+two-class classification.
+We have that
+
+nu is an upper bound on the ratio of training points
+on the wrong side of the hyperplane, and
+ therefore, nu is also an upper bound on the training error rate.
+
+Numerical issues occur in calculating the first case,
+because for some training points satisfying y(w^Tx + b) - rho = 0
+the computed value becomes slightly negative.
+However, we have no numerical problems for the second case, because
+we compare y(w^Tx + b) with 0 when counting training errors.
+
+[Go Top]
+
+
+Q: Why does the code give NaN (not a number) results?
+
+
+This rarely happens, but a few users have reported the problem.
+It seems that their
+computers for training libsvm have a VPN client
+running. The VPN software has some bugs and causes this
+problem. Please try to close or disconnect the VPN client.
+
+[Go Top]
+
+
+Q: Why are the signs of predicted labels and decision values sometimes reversed?
+
+
+
+This situation may occur before version 3.17.
+Nothing is wrong. Very likely you have two labels +1/-1 and the first instance in your data
+has -1. We give the following explanation.
+
+
+Internally class labels are ordered by their first occurrence in the training set. For a k-class data, internally labels
+are 0, ..., k-1, and each two-class SVM considers pair
+(i, j) with i < j. Then class i is treated as positive (+1)
+and j as negative (-1).
+For example, if the data set has labels +5/+10 and +10 appears
+first, then internally the +5 versus +10 SVM problem
+has +10 as positive (+1) and +5 as negative (-1).
+
+
+By this setting, if you have labels +1 and -1,
+it's possible that internally they correspond to -1 and +1,
+respectively. Some new users have been confused about
+this, so after version 3.17, if the data set has only the
+two labels +1 and -1,
+internally we ensure that +1 comes before -1. Then class +1
+is always treated as positive in the SVM problem.
+Note that this is for two-class data only.
+
+[Go Top]
+
+
+Q: I don't know class labels of test data. What should I put in the first column of the test file?
+
+Any value is OK; for example, you can simply fill the first column with 0. In this situation, what you will use is the output file of svm-predict, which gives the predicted class labels.
+
+
+
+[Go Top]
+
+
+Q: How can I use OpenMP to parallelize LIBSVM on a multicore/shared-memory computer?
+
+
+It is very easy if you are using GCC 4.2
+or later.
+
+
+In Makefile, add -fopenmp to CFLAGS.
+
+
+In class SVC_Q of svm.cpp, modify the for loop
+of get_Q to:
+
+#pragma omp parallel for private(j) schedule(guided)
+ for(j=start;j<len;j++)
+
+ In the subroutine svm_predict_values of svm.cpp, add one line to the for loop:
+
+#pragma omp parallel for private(i) schedule(guided)
+ for(i=0;i<l;i++)
+ kvalue[i] = Kernel::k_function(x,model->SV[i],model->param);
+
+For regression, you need to modify
+class SVR_Q instead. The loop in svm_predict_values
+is also different because you need
+a reduction clause for the variable sum:
+
+#pragma omp parallel for private(i) reduction(+:sum) schedule(guided)
+ for(i=0;i<model->l;i++)
+ sum += sv_coef[i] * Kernel::k_function(x,model->SV[i],model->param);
+
+
+ Then rebuild the package. Kernel evaluations in training/testing will be parallelized. An example of running this modification on
+an 8-core machine using the data set
+real-sim:
+
+
+8 cores:
+
+%setenv OMP_NUM_THREADS 8
+%time svm-train -c 8 -g 0.5 -m 1000 real-sim
+175.90sec
+
+1 core:
+
+%setenv OMP_NUM_THREADS 1
+%time svm-train -c 8 -g 0.5 -m 1000 real-sim
+588.89sec
+
+For this data, kernel evaluations take 91% of training time. In the above example, we assume you use csh. For bash, use
+
+export OMP_NUM_THREADS=8
+
+instead.
+
+ For Python interface, you need to add the -lgomp link option:
+
+$(CXX) -lgomp -shared -dynamiclib svm.o -o libsvm.so.$(SHVER)
+
+
+ For MS Windows, you need to add /openmp in CFLAGS of Makefile.win
+
+
+[Go Top]
+
+
+Q: How could I know which training instances are support vectors?
+
+
+
+It's very simple. Since version 3.13, you can use the function
+
+void svm_get_sv_indices(const struct svm_model *model, int *sv_indices)
+
+to get indices of support vectors. For example, in svm-train.c, after
+
+ model = svm_train(&prob, ¶m);
+
+you can add
+
+ int nr_sv = svm_get_nr_sv(model);
+ int *sv_indices = Malloc(int, nr_sv);
+ svm_get_sv_indices(model, sv_indices);
+ for (int i=0; i<nr_sv; i++)
+ printf("instance %d is a support vector\n", sv_indices[i]);
+
+
+ If you use matlab interface, you can directly check
+
+model.sv_indices
+
+
+[Go Top]
+
+
+Q: Why are sv_indices (indices of support vectors) not stored in the saved model file?
+
+
+
+Although sv_indices is a member of the model structure
+to
+indicate support vectors in the training set,
+we do not store its contents in the model file.
+The model file is mainly used for future
+prediction, so it is basically independent
+of the training data. Thus
+storing sv_indices is not necessary.
+Users should find support vectors right after
+the training process. See the previous FAQ.
+
+[Go Top]
+
+
+Q: After doing cross validation, why is there no model file outputted?
+
+
+Cross validation is used for selecting good parameters.
+After finding them, you want to re-train on the whole
+data without the -v option.
+
+[Go Top]
+
+
+Q: Why are my cross-validation results different from those in the Practical Guide?
+
+
+
+Due to random partitions of
+the data, on different systems CV accuracy values
+may be different.
+
+[Go Top]
+
+
+Q: On some systems CV accuracy is the same in several runs. How could I use different data partitions? In other words, how do I set random seed in LIBSVM?
+
+
+If you use the GNU C library,
+the default seed 1 is used. Thus you always
+get the same result when running svm-train -v.
+To have different seeds, you can add the following code
+in svm-train.c:
+
+#include <time.h>
+
+and in the beginning of main(),
+
+srand(time(0));
+
+Alternatively, if you are not using GNU C library
+and would like to use a fixed seed, you can have
+
+srand(1);
+
+
+
+For Java, the random number generator
+is initialized using the time information.
+So results of two CV runs are different.
+To fix the seed, after version 3.1 (released
+in mid 2011), you can add
+
+svm.rand.setSeed(0);
+
+in the main() function of svm_train.java.
+
+
+If you use CV to select parameters, it is recommended to use identical folds
+under different parameters. In this case, you can consider fixing the seed.
+
+[Go Top]
+
+
+Q: Why does grid.py sometimes fail on Windows?
+
+
+
+This problem shouldn't happen after version
+2.85. If you are using earlier versions,
+please download the latest one.
+
+
+
+[Go Top]
+
+
+Q: Why do grid.py/easy.py sometimes generate the following warning message?
+
+
+Warning: empty z range [62.5:62.5], adjusting to [61.875:63.125]
+Notice: cannot contour non grid data!
+
+Nothing is wrong; please disregard the
+message. It comes from gnuplot when drawing
+the contour.
+
+[Go Top]
+
+
+Q: How do I choose the kernel?
+
+
+
+In general we suggest trying the RBF kernel first.
+A result by Keerthi and Lin
+(
+download paper here )
+shows that if RBF is used with model selection,
+then there is no need to consider the linear kernel.
+The kernel matrix using sigmoid may not be positive definite,
+and in general its accuracy is not better than RBF
+(see the paper by Lin and Lin:
+
+download paper here).
+Polynomial kernels are OK, but if a high degree is used,
+numerical difficulties tend to happen
+(think of the dth power of a number: values less than 1 go to 0
+and values greater than 1 go to infinity; e.g., 0.5^10 is about 0.001
+while 2^10 = 1024).
+
+[Go Top]
+
+
+Q: How does LIBSVM perform parameter selection for multi-class problems?
+
+
+
+LIBSVM implements "one-against-one" multi-class method, so there are
+k(k-1)/2 binary models, where k is the number of classes.
+
+
+We can consider two ways to conduct parameter selection.
+
+
+
+For any two classes of data, a parameter selection procedure is conducted. Finally,
+each decision function has its own optimal parameters.
+
+
+The same parameters are used for all k(k-1)/2 binary classification problems.
+We select parameters that achieve the highest overall performance.
+
+
+
+Each has its own advantages. A
+single parameter set may not be uniformly good for all k(k-1)/2 decision functions.
+However, as the overall accuracy is the final consideration, one parameter set
+for one decision function may lead to over-fitting. In the paper
+
+Chen, Lin, and Schölkopf,
+
+A tutorial on nu-support vector machines.
+
+Applied Stochastic Models in Business and Industry, 21(2005), 111-136,
+
+
+they have experimentally
+shown that the two methods give similar performance.
+Therefore, currently the parameter selection in LIBSVM
+takes the second approach by considering the same parameters for
+all k(k-1)/2 models.
+
+[Go Top]
+
+
+Q: How do I choose parameters for one-class SVM as training data are in only one class?
+
+
+You have a pre-specified true positive rate in mind, and you then search for
+parameters that achieve a similar cross-validation accuracy.
+
+[Go Top]
+
+
+Q: Instead of grid.py, what if I would like to conduct parameter selection using other programming languages?
+
+
+For MATLAB, please see another question in this FAQ.
+
+
+For shell scripts, please check the code written by Bjarte Johansen.
+
+[Go Top]
+
+
+Q: Why does training a probability model (i.e., -b 1) take a longer time?
+
+
+To construct this probability model, we internally conduct a
+cross validation, which is more time consuming than
+regular training.
+Hence, in general you do parameter selection first without
+-b 1. You only use -b 1 after good parameters have been
+selected. In other words, you avoid using -b 1 and -v
+together; see the example below.
+
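+A typical sequence (the parameter values here are illustrative) is
+
+> svm-train -v 5 -c 8 -g 0.5 train_file
+> svm-train -b 1 -c 8 -g 0.5 train_file
+> svm-predict -b 1 test_file train_file.model output
+
+where the first command selects parameters by cross validation and the
+last two train and apply the probability model.
+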
+[Go Top]
+
+
+Q: Why does using the -b option not give me better accuracy?
+
+
+There is absolutely no reason the probability outputs should guarantee
+better accuracy. The main purpose of this option is
+to provide probability estimates, not to boost
+prediction accuracy. From our experience,
+after proper parameter selection, results with
+and without -b generally have similar accuracy. Occasionally there
+are some differences.
+It is not recommended to compare the two under
+just a fixed parameter
+set, as more differences will be observed.
+
+[Go Top]
+
+
+Q: Why do svm-predict -b 0 and -b 1 give different accuracy values?
+
+
+Let's just consider two-class classification here. After probability information is obtained in training,
+we do not have
+
+prob >= 0.5 if and only if decision value >= 0.
+
+So predictions may be different with -b 0 and -b 1.
+
+[Go Top]
+
+
+Q: How can I save images drawn by svm-toy?
+
+
+For Microsoft Windows, first press the "print screen" key on the keyboard.
+Open "Microsoft Paint"
+(included in Windows)
+and press "ctrl-v." Then you can clip
+the part of the picture you want.
+For X Window systems, you can
+use the program "xv" or "import" to grab the picture of the svm-toy window.
+
+[Go Top]
+
+
+Q: I press the "load" button to load data points but why svm-toy does not draw them ?
+
+
+The program svm-toy assumes both attributes (i.e. x-axis and y-axis
+values) are in (0,1). Hence you want to scale your
+data to between a small positive number and
+a number less than but very close to 1.
+Moreover, class labels must be 1, 2, or 3
+(not 1.0, 2.0 or anything else).
+
+[Go Top]
+
+
+Q: I would like svm-toy to handle more than three classes of data. What should I do?
+
+
+Taking windows/svm-toy.cpp as an example, you need to
+modify it; the difference
+from the original file is as follows (for five classes of
+data):
+
+30,32c30
+< RGB(200,0,200),
+< RGB(0,160,0),
+< RGB(160,0,0)
+---
+> RGB(200,0,200)
+39c37
+< HBRUSH brush1, brush2, brush3, brush4, brush5;
+---
+> HBRUSH brush1, brush2, brush3;
+113,114d110
+< brush4 = CreateSolidBrush(colors[7]);
+< brush5 = CreateSolidBrush(colors[8]);
+155,157c151
+< else if(v==3) return brush3;
+< else if(v==4) return brush4;
+< else return brush5;
+---
+> else return brush3;
+325d318
+< int colornum = 5;
+327c320
+< svm_node *x_space = new svm_node[colornum * prob.l];
+---
+> svm_node *x_space = new svm_node[3 * prob.l];
+333,338c326,331
+< x_space[colornum * i].index = 1;
+< x_space[colornum * i].value = q->x;
+< x_space[colornum * i + 1].index = 2;
+< x_space[colornum * i + 1].value = q->y;
+< x_space[colornum * i + 2].index = -1;
+< prob.x[i] = &x_space[colornum * i];
+---
+> x_space[3 * i].index = 1;
+> x_space[3 * i].value = q->x;
+> x_space[3 * i + 1].index = 2;
+> x_space[3 * i + 1].value = q->y;
+> x_space[3 * i + 2].index = -1;
+> prob.x[i] = &x_space[3 * i];
+397c390
+< if(current_value > 5) current_value = 1;
+---
+> if(current_value > 3) current_value = 1;
+
+
+[Go Top]
+
+
+Q: What is the difference between Java version and C++ version of libsvm?
+
+
+They are the same thing. We just rewrote the C++ code
+in Java.
+
+[Go Top]
+
+
+Q: Is the Java version significantly slower than the C++ version?
+
+
+This depends on the VM you use. We have seen good
+VMs that make the Java version quite competitive with
+the C++ code (though still slower).
+
+[Go Top]
+
+
+Q: While training I get the following error message: java.lang.OutOfMemoryError. What is wrong?
+
+
+You should try to increase the maximum Java heap size.
+For example,
+
+java -Xmx2048m -classpath libsvm.jar svm_train ...
+
+sets the maximum heap size to 2048M.
+
+[Go Top]
+
+
+Q: Why do you have the main source file svm.m4 and then transform it to svm.java?
+
+
+Unlike C, Java does not have a built-in preprocessor.
+However, we need some macros (see the first 3 lines of svm.m4).
+
+
+
+[Go Top]
+
+
+Q: Besides the provided Python-C++ interface, could I use Jython to call libsvm?
+
+ Yes, here are some examples:
+
+
+$ export CLASSPATH=$CLASSPATH:~/libsvm-2.91/java/libsvm.jar
+$ ./jython
+Jython 2.1a3 on java1.3.0 (JIT: jitc)
+Type "copyright", "credits" or "license" for more information.
+>>> from libsvm import *
+>>> dir()
+['__doc__', '__name__', 'svm', 'svm_model', 'svm_node', 'svm_parameter',
+'svm_problem']
+>>> x1 = [svm_node(index=1,value=1)]
+>>> x2 = [svm_node(index=1,value=-1)]
+>>> param = svm_parameter(svm_type=0,kernel_type=2,gamma=1,cache_size=40,eps=0.001,C=1,nr_weight=0,shrinking=1)
+>>> prob = svm_problem(l=2,y=[1,-1],x=[x1,x2])
+>>> model = svm.svm_train(prob,param)
+*
+optimization finished, #iter = 1
+nu = 1.0
+obj = -1.018315639346838, rho = 0.0
+nSV = 2, nBSV = 2
+Total nSV = 2
+>>> svm.svm_predict(model,x1)
+1.0
+>>> svm.svm_predict(model,x2)
+-1.0
+>>> svm.svm_save_model("test.model",model)
+
+
+
+
+[Go Top]
+
+
+Q: I compile the MATLAB interface without problems, but why do errors occur while running it?
+
+
+Your compiler version may not be supported by/compatible with MATLAB.
+Please check this MATLAB page first and then specify the version
+number. For example, if g++ X.Y is supported, replace
+
+CXX = g++
+
+in the Makefile with
+
+CXX = g++-X.Y
+
+
+[Go Top]
+
+
+Q: On 64-bit Windows I compile the MATLAB interface without problems, but why do errors occur while running it?
+
+
+
+
+Please make sure that you use
+the -largeArrayDims option in make.m. For example,
+
+mex -largeArrayDims -O -c svm.cpp
+
+
+Moreover, if you use Microsoft Visual Studio,
+probably it is not properly installed.
+See the explanation
+here.
+
+[Go Top]
+
+
+Q: Does the MATLAB interface provide a function to do scaling?
+
+
+It is extremely easy to do scaling under MATLAB.
+The following one-line code scales each feature to the range
+of [0,1]:
+
+(data - repmat(min(data,[],1),size(data,1),1))*spdiags(1./(max(data,[],1)-min(data,[],1))',0,size(data,2),size(data,2))
+
+
+[Go Top]
+
+
+Q: How could I use MATLAB interface for parameter selection?
+
+
+One can do this by a simple loop.
+See the following example:
+
+bestcv = 0;
+for log2c = -1:3,
+ for log2g = -4:1,
+ cmd = ['-v 5 -c ', num2str(2^log2c), ' -g ', num2str(2^log2g)];
+ cv = svmtrain(heart_scale_label, heart_scale_inst, cmd);
+ if (cv >= bestcv),
+ bestcv = cv; bestc = 2^log2c; bestg = 2^log2g;
+ end
+ fprintf('%g %g %g (best c=%g, g=%g, rate=%g)\n', log2c, log2g, cv, bestc, bestg, bestcv);
+ end
+end
+
+You may adjust the parameter range in the above loops.
+
+[Go Top]
+
+
+Q: I use the MATLAB parallel programming toolbox on a multi-core environment for parameter selection. Why is the program even slower?
+
+
+Fabrizio Lacalandra of the University of Pisa reported this issue.
+It seems the problem is caused by the screen output.
+If you disable the info function
+using #if 0, then the problem
+may be solved.
+
+[Go Top]
+
+
+Q: How to use LIBSVM with OpenMP under MATLAB/Octave?
+
+
+
+First, you must modify svm.cpp. Check the following faq,
+
+How can I use OpenMP to parallelize LIBSVM on a multicore/shared-memory computer?
+
+
+To build the MATLAB/Octave interface, we recommend using make.m.
+You must append '-fopenmp' to CXXFLAGS and add '-lgomp' to the mex options in make.m.
+See details below.
+
+
+For MATLAB users, the modified code is:
+
+mex CFLAGS="\$CFLAGS -std=c99" CXXFLAGS="\$CXXFLAGS -fopenmp" -largeArrayDims -I.. -lgomp svmtrain.c ../svm.cpp svm_model_matlab.c
+mex CFLAGS="\$CFLAGS -std=c99" CXXFLAGS="\$CXXFLAGS -fopenmp" -largeArrayDims -I.. -lgomp svmpredict.c ../svm.cpp svm_model_matlab.c
+
+
+
+For Octave users, the modified code is:
+
+setenv('CXXFLAGS', '-fopenmp')
+mex -I.. -lgomp svmtrain.c ../svm.cpp svm_model_matlab.c
+mex -I.. -lgomp svmpredict.c ../svm.cpp svm_model_matlab.c
+
+
+
+If make.m fails under matlab and you use Makefile to compile the codes,
+you must modify two files:
+
+
+You must append '-fopenmp' to CFLAGS in ../Makefile for C/C++ codes:
+
+CFLAGS = -Wall -Wconversion -O3 -fPIC -fopenmp -I$(MATLABDIR)/extern/include -I..
+
+and add '-lgomp' to MEX_OPTION in Makefile for the matlab/octave interface:
+
+MEX_OPTION += -lgomp
+
+
+
+ To run the code, you must specify the number of threads. For
+ example, before executing matlab/octave, you run
+
+> export OMP_NUM_THREADS=8
+> matlab
+
+Here we assume Bash is used. Unfortunately, we do not know yet
+how to specify the number of threads within MATLAB/Octave. Our
+experiments show that
+
+>> setenv('OMP_NUM_THREADS', '8');
+
+does not work. Please contact us if you
+see how to solve this problem. On the other hand, you can
+specify the number of threads in the source code (thanks
+to comments from Ricardo Santiago-mozos):
+
+#pragma omp parallel for private(i) num_threads(8)
+
+
+[Go Top]
+
+
+Q: How could I generate the primal variable w of linear SVM?
+
+
+Let's start from the binary class and
+assume you have two labels -1 and +1.
+After obtaining the model from calling svmtrain,
+do the following to have w and b:
+
+w = model.SVs' * model.sv_coef;
+b = -model.rho;
+
+if model.Label(1) == -1
+ w = -w;
+ b = -b;
+end
+
+If you do regression or one-class SVM, then the if statement is not needed.
+
+ For multi-class SVM, we illustrate the setting
+in the following example of running the iris
+data, which have 3 classes
+
+> [y, x] = libsvmread('../../htdocs/libsvmtools/datasets/multiclass/iris.scale');
+> m = svmtrain(y, x, '-t 0')
+
+m =
+
+ Parameters: [5x1 double]
+ nr_class: 3
+ totalSV: 42
+ rho: [3x1 double]
+ Label: [3x1 double]
+ ProbA: []
+ ProbB: []
+ nSV: [3x1 double]
+ sv_coef: [42x2 double]
+ SVs: [42x4 double]
+
+sv_coef is like:
+
++-+-+--------------------+
+|1|1| |
+|v|v| SVs from class 1 |
+|2|3| |
++-+-+--------------------+
+|1|2| |
+|v|v| SVs from class 2 |
+|2|3| |
++-+-+--------------------+
+|1|2| |
+|v|v| SVs from class 3 |
+|3|3| |
++-+-+--------------------+
+
+so we need to check the nSV of each class.
+
+> m.nSV
+
+ans =
+
+ 3
+ 21
+ 18
+
+Suppose the goal is to find the vector w of classes
+1 vs 3. Then the
+y_i alpha_i values of training 1 vs 3 are
+
+> coef = [m.sv_coef(1:3,2); m.sv_coef(25:42,1)];
+
+and SVs are:
+
+> SVs = [m.SVs(1:3,:); m.SVs(25:42,:)];
+
+Hence, w is
+
+> w = SVs'*coef;
+
+For rho,
+
+> m.rho
+
+ans =
+
+ 1.1465
+ 0.3682
+ -1.9969
+> b = -m.rho(2);
+
+because rho is arranged in the order 1vs2, 1vs3, 2vs3.
+
+
+
+
+[Go Top]
+
+
+Q: Is there an OCTAVE interface for libsvm?
+
+
+Yes, after libsvm 2.86, the MATLAB interface
+works on OCTAVE as well. Please use make.m by typing
+
+>> make
+
+under OCTAVE.
+
+[Go Top]
+
+
+Q: How to handle the name conflict between svmtrain in the libsvm matlab interface and that in MATLAB bioinformatics toolbox?
+
+
+The easiest way is to rename the svmtrain binary
+file (e.g., svmtrain.mexw32 on 32-bit windows)
+to a different
+name (e.g., svmtrain2.mexw32).
+
+[Go Top]
+
+
+Q: On Windows I got an error message "Invalid MEX-file: Specific module not found" when running the pre-built MATLAB interface in the windows sub-directory. What should I do?
+
+
+
+The error usually happens
+when there are missing runtime components
+such as MSVCR100.dll on your Windows platform.
+You can use tools such as
+Dependency
+Walker to find missing library files.
+
+
+For example, if the pre-built MEX files are compiled by
+Visual C++ 2010,
+you must have installed
+Microsoft Visual C++ Redistributable Package 2010
+(vcredist_x86.exe). You can easily find the freely
+available file from Microsoft's web site.
+
+
+For 64bit Windows, the situation is similar. If
+the pre-built files are by
+Visual C++ 2008, then you must have
+Microsoft Visual C++ Redistributable Package 2008
+(vcredist_x64.exe).
+
+[Go Top]
+
+
+Q: LIBSVM supports 1-vs-1 multi-class classification. If instead I would like to use 1-vs-rest, how to implement it using MATLAB interface?
+
+
+
+Please use the code in the following directory. The following example shows how to
+train and test the problem dna (training and testing sets).
+
+
+Load, train and predict data:
+
+[trainY trainX] = libsvmread('./dna.scale');
+[testY testX] = libsvmread('./dna.scale.t');
+model = ovrtrain(trainY, trainX, '-c 8 -g 4');
+[pred ac decv] = ovrpredict(testY, testX, model);
+fprintf('Accuracy = %g%%\n', ac * 100);
+
+Conduct CV on a grid of parameters
+
+bestcv = 0;
+for log2c = -1:2:3,
+ for log2g = -4:2:1,
+ cmd = ['-q -c ', num2str(2^log2c), ' -g ', num2str(2^log2g)];
+ cv = get_cv_ac(trainY, trainX, cmd, 3);
+ if (cv >= bestcv),
+ bestcv = cv; bestc = 2^log2c; bestg = 2^log2g;
+ end
+ fprintf('%g %g %g (best c=%g, g=%g, rate=%g)\n', log2c, log2g, cv, bestc, bestg, bestcv);
+ end
+end
+
+
+[Go Top]
+
+
+Q: I tried to install the MATLAB interface on Mac, but failed. What should I do?
+
+
+
+We assume that in a matlab command window you change directory to libsvm/matlab and type
+
+>> make
+
+We discuss the following situations.
+
+
+An error message like "libsvmread.c:1:19: fatal error:
+stdio.h: No such file or directory" appears.
+
+
+Reason: "make" looks for a C++ compiler, but
+no compiler is found. To get one, you can
+
+ Install Xcode offered by Apple Inc.
+ Install Xcode Command Line Tools.
+
+
+
+
+On OS X with Xcode 4.2+, I got an error message like "llvm-gcc-4.2:
+command not found."
+
+
+Reason: Since Apple Inc. only ships llvm-gcc instead of gcc-4.2,
+llvm-gcc-4.2 cannot be found.
+
+
+If you are using Xcode 4.2-4.6,
+a related solution is offered at
+http://www.mathworks.com/matlabcentral/answers/94092 .
+
+
+On the other hand, for Xcode 5 (including Xcode 4.2-4.6), in a Matlab command window, enter
+
+
+Please also ensure that SDKROOT corresponds to the SDK version you are using.
+
+
+
+Other errors: you may check http://www.mathworks.com/matlabcentral/answers/94092 .
+
+
+
+[Go Top]
+
+
+Q: I tried to install the Octave interface on Windows, but failed. What should I do?
+
+
+
+This may be because Octave's math.h file does not
+refer to the correct location of Visual Studio's math.h.
+Please see this nice page for detailed
+instructions.
+
+[Go Top]
+
+
+LIBSVM home page
+
+
+
diff --git a/src/backend/app/algorithms/evaluate/libsvm/Makefile b/src/backend/app/algorithms/evaluate/libsvm/Makefile
new file mode 100644
index 0000000..db6ab34
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/Makefile
@@ -0,0 +1,25 @@
+CXX ?= g++
+CFLAGS = -Wall -Wconversion -O3 -fPIC
+SHVER = 2
+OS = $(shell uname)
+
+all: svm-train svm-predict svm-scale
+
+lib: svm.o
+ if [ "$(OS)" = "Darwin" ]; then \
+ SHARED_LIB_FLAG="-dynamiclib -Wl,-install_name,libsvm.so.$(SHVER)"; \
+ else \
+ SHARED_LIB_FLAG="-shared -Wl,-soname,libsvm.so.$(SHVER)"; \
+ fi; \
+ $(CXX) $${SHARED_LIB_FLAG} svm.o -o libsvm.so.$(SHVER)
+
+svm-predict: svm-predict.c svm.o
+ $(CXX) $(CFLAGS) svm-predict.c svm.o -o svm-predict -lm
+svm-train: svm-train.c svm.o
+ $(CXX) $(CFLAGS) svm-train.c svm.o -o svm-train -lm
+svm-scale: svm-scale.c
+ $(CXX) $(CFLAGS) svm-scale.c -o svm-scale
+svm.o: svm.cpp svm.h
+ $(CXX) $(CFLAGS) -c svm.cpp
+clean:
+ rm -f *~ svm.o svm-train svm-predict svm-scale libsvm.so.$(SHVER)
diff --git a/src/backend/app/algorithms/evaluate/libsvm/Makefile.win b/src/backend/app/algorithms/evaluate/libsvm/Makefile.win
new file mode 100644
index 0000000..b1d3570
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/Makefile.win
@@ -0,0 +1,33 @@
+#You must ensure nmake.exe, cl.exe, link.exe are in system path.
+#VCVARS64.bat
+#Under dosbox prompt
+#nmake -f Makefile.win
+
+##########################################
+CXX = cl.exe
+CFLAGS = /nologo /O2 /EHsc /I. /D _WIN64 /D _CRT_SECURE_NO_DEPRECATE
+TARGET = windows
+
+all: $(TARGET)\svm-train.exe $(TARGET)\svm-predict.exe $(TARGET)\svm-scale.exe $(TARGET)\svm-toy.exe lib
+
+$(TARGET)\svm-predict.exe: svm.h svm-predict.c svm.obj
+ $(CXX) $(CFLAGS) svm-predict.c svm.obj -Fe$(TARGET)\svm-predict.exe
+
+$(TARGET)\svm-train.exe: svm.h svm-train.c svm.obj
+ $(CXX) $(CFLAGS) svm-train.c svm.obj -Fe$(TARGET)\svm-train.exe
+
+$(TARGET)\svm-scale.exe: svm.h svm-scale.c
+ $(CXX) $(CFLAGS) svm-scale.c -Fe$(TARGET)\svm-scale.exe
+
+$(TARGET)\svm-toy.exe: svm.h svm.obj svm-toy\windows\svm-toy.cpp
+ $(CXX) $(CFLAGS) svm-toy\windows\svm-toy.cpp svm.obj user32.lib gdi32.lib comdlg32.lib -Fe$(TARGET)\svm-toy.exe
+
+svm.obj: svm.cpp svm.h
+ $(CXX) $(CFLAGS) -c svm.cpp
+
+lib: svm.cpp svm.h svm.def
+ $(CXX) $(CFLAGS) -LD svm.cpp -Fe$(TARGET)\libsvm -link -DEF:svm.def
+
+clean:
+ -erase /Q *.obj $(TARGET)\*.exe $(TARGET)\*.dll $(TARGET)\*.exp $(TARGET)\*.lib
+
diff --git a/src/backend/app/algorithms/evaluate/libsvm/README b/src/backend/app/algorithms/evaluate/libsvm/README
new file mode 100644
index 0000000..5b32236
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/README
@@ -0,0 +1,769 @@
+Libsvm is a simple, easy-to-use, and efficient software for SVM
+classification and regression. It solves C-SVM classification, nu-SVM
+classification, one-class-SVM, epsilon-SVM regression, and nu-SVM
+regression. It also provides an automatic model selection tool for
+C-SVM classification. This document explains the use of libsvm.
+
+Libsvm is available at
+http://www.csie.ntu.edu.tw/~cjlin/libsvm
+Please read the COPYRIGHT file before using libsvm.
+
+Table of Contents
+=================
+
+- Quick Start
+- Installation and Data Format
+- `svm-train' Usage
+- `svm-predict' Usage
+- `svm-scale' Usage
+- Tips on Practical Use
+- Examples
+- Precomputed Kernels
+- Library Usage
+- Java Version
+- Building Windows Binaries
+- Additional Tools: Sub-sampling, Parameter Selection, Format checking, etc.
+- MATLAB/OCTAVE Interface
+- Python Interface
+- Additional Information
+
+Quick Start
+===========
+
+If you are new to SVM and if the data is not large, please go to
+the `tools' directory and use easy.py after installation. It does
+everything automatically -- from data scaling to parameter selection.
+
+Usage: easy.py training_file [testing_file]
+
+More information about parameter selection can be found in
+`tools/README.'
+
+Installation and Data Format
+============================
+
+On Unix systems, type `make' to build the `svm-train' and `svm-predict'
+programs. Run them without arguments to show their usage.
+
+On other systems, consult `Makefile' to build them (e.g., see
+'Building Windows binaries' in this file) or use the pre-built
+binaries (Windows binaries are in the directory `windows').
+
+The format of training and testing data file is:
+
+<label> <index1>:<value1> <index2>:<value2> ...
+.
+.
+.
+
+Each line contains an instance and is ended by a '\n' character. For
+classification, <label> is an integer indicating the class label
+(multi-class is supported). For regression, <label> is the target
+value which can be any real number. For one-class SVM, it's not used
+so can be any number. The pair <index>:<value> gives a feature
+(attribute) value: <index> is an integer starting from 1 and <value>
+is a real number. The only exception is the precomputed kernel, where
+<index> starts from 0; see the section of precomputed kernels. Indices
+must be in ASCENDING order. Labels in the testing file are only used
+to calculate accuracy or errors. If they are unknown, just fill the
+first column with any numbers.
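+
+For example, a made-up instance of class +1 whose only nonzero
+features are 1 and 3 (a hypothetical line, not taken from any data
+set in this package) would be written as:
+
++1 1:0.5 3:-0.2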
+
+A sample classification data included in this package is
+`heart_scale'. To check if your data is in a correct form, use
+`tools/checkdata.py' (details in `tools/README').
+
+Type `svm-train heart_scale', and the program will read the training
+data and output the model file `heart_scale.model'. If you have a test
+set called heart_scale.t, then type `svm-predict heart_scale.t
+heart_scale.model output' to see the prediction accuracy. The `output'
+file contains the predicted class labels.
+
+For classification, if training data are in only one class (i.e., all
+labels are the same), then `svm-train' issues a warning message:
+`Warning: training data in only one class. See README for details,'
+which means the training data is very unbalanced. The label in the
+training data is directly returned when testing.
+
+There are some other useful programs in this package.
+
+svm-scale:
+
+	This is a tool for scaling input data files.
+
+svm-toy:
+
+	This is a simple graphical interface which shows how SVM
+	separates data in a plane. You can click in the window to
+ draw data points. Use "change" button to choose class
+ 1, 2 or 3 (i.e., up to three classes are supported), "load"
+ button to load data from a file, "save" button to save data to
+ a file, "run" button to obtain an SVM model, and "clear"
+ button to clear the window.
+
+	You can enter options in the bottom of the window; the syntax of
+	options is the same as for `svm-train'.
+
+ Note that "load" and "save" consider dense data format both in
+ classification and the regression cases. For classification,
+ each data point has one label (the color) that must be 1, 2,
+ or 3 and two attributes (x-axis and y-axis values) in
+ [0,1). For regression, each data point has one target value
+ (y-axis) and one attribute (x-axis values) in [0, 1).
+
+ Type `make' in respective directories to build them.
+
+ You need Qt library to build the Qt version.
+ (available from http://www.trolltech.com)
+
+ You need GTK+ library to build the GTK version.
+ (available from http://www.gtk.org)
+
+ The pre-built Windows binaries are in the `windows'
+ directory. We use Visual C++ on a 64-bit machine.
+
+`svm-train' Usage
+=================
+
+Usage: svm-train [options] training_set_file [model_file]
+options:
+-s svm_type : set type of SVM (default 0)
+ 0 -- C-SVC (multi-class classification)
+ 1 -- nu-SVC (multi-class classification)
+ 2 -- one-class SVM
+ 3 -- epsilon-SVR (regression)
+ 4 -- nu-SVR (regression)
+-t kernel_type : set type of kernel function (default 2)
+ 0 -- linear: u'*v
+ 1 -- polynomial: (gamma*u'*v + coef0)^degree
+ 2 -- radial basis function: exp(-gamma*|u-v|^2)
+ 3 -- sigmoid: tanh(gamma*u'*v + coef0)
+ 4 -- precomputed kernel (kernel values in training_set_file)
+-d degree : set degree in kernel function (default 3)
+-g gamma : set gamma in kernel function (default 1/num_features)
+-r coef0 : set coef0 in kernel function (default 0)
+-c cost : set the parameter C of C-SVC, epsilon-SVR, and nu-SVR (default 1)
+-n nu : set the parameter nu of nu-SVC, one-class SVM, and nu-SVR (default 0.5)
+-p epsilon : set the epsilon in loss function of epsilon-SVR (default 0.1)
+-m cachesize : set cache memory size in MB (default 100)
+-e epsilon : set tolerance of termination criterion (default 0.001)
+-h shrinking : whether to use the shrinking heuristics, 0 or 1 (default 1)
+-b probability_estimates : whether to train an SVC or SVR model for probability estimates, 0 or 1 (default 0)
+-wi weight : set the parameter C of class i to weight*C, for C-SVC (default 1)
+-v n: n-fold cross validation mode
+-q : quiet mode (no outputs)
+
+
+num_features in the -g option means the number of attributes in the
+input data.
+
+The -v option randomly splits the data into n parts and calculates cross
+validation accuracy/mean squared error on them.
+
+See libsvm FAQ for the meaning of outputs.
+
+`svm-predict' Usage
+===================
+
+Usage: svm-predict [options] test_file model_file output_file
+options:
+-b probability_estimates: whether to predict probability estimates, 0 or 1 (default 0); for one-class SVM only 0 is supported
+
+model_file is the model file generated by svm-train.
+test_file is the test data you want to predict.
+svm-predict will produce output in the output_file.
+
+`svm-scale' Usage
+=================
+
+Usage: svm-scale [options] data_filename
+options:
+-l lower : x scaling lower limit (default -1)
+-u upper : x scaling upper limit (default +1)
+-y y_lower y_upper : y scaling limits (default: no y scaling)
+-s save_filename : save scaling parameters to save_filename
+-r restore_filename : restore scaling parameters from restore_filename
+
+See 'Examples' in this file for examples.
+
+Tips on Practical Use
+=====================
+
+* Scale your data. For example, scale each attribute to [0,1] or [-1,+1].
+* For C-SVC, consider using the model selection tool in the tools directory.
+* nu in nu-SVC/one-class-SVM/nu-SVR approximates the fraction of training
+ errors and support vectors.
+* If data for classification are unbalanced (e.g. many positive and
+ few negative), try different penalty parameters C by -wi (see
+ examples below).
+* Specify larger cache size (i.e., larger -m) for huge problems.
+
+Examples
+========
+
+> svm-scale -l -1 -u 1 -s range train > train.scale
+> svm-scale -r range test > test.scale
+
+Scale each feature of the training data to be in [-1,1]. Scaling
+factors are stored in the file range and then used for scaling the
+test data.
+
+> svm-train -s 0 -c 5 -t 2 -g 0.5 -e 0.1 data_file
+
+Train a classifier with RBF kernel exp(-0.5|u-v|^2), C=5, and
+stopping tolerance 0.1.
+
+> svm-train -s 3 -p 0.1 -t 0 data_file
+
+Solve SVM regression with linear kernel u'v and epsilon=0.1
+in the loss function.
+
+> svm-train -c 10 -w1 1 -w-2 5 -w4 2 data_file
+
+Train a classifier with penalty 10 = 1 * 10 for class 1, penalty 50 =
+5 * 10 for class -2, and penalty 20 = 2 * 10 for class 4.
+
+> svm-train -s 0 -c 100 -g 0.1 -v 5 data_file
+
+Do five-fold cross validation for the classifier using
+the parameters C = 100 and gamma = 0.1.
+
+> svm-train -s 0 -b 1 data_file
+> svm-predict -b 1 test_file data_file.model output_file
+
+Obtain a model with probability information and predict test data with
+probability estimates.
+
+Precomputed Kernels
+===================
+
+Users may precompute kernel values and input them as training and
+testing files. Then libsvm does not need the original
+training/testing sets.
+
+Assume there are L training instances x1, ..., xL. Let K(x, y) be
+the kernel value of two instances x and y. The input formats
+are:
+
+New training instance for xi:
+
+	<label> 0:i 1:K(xi,x1) ... L:K(xi,xL)
+
+New testing instance for any x:
+
+	<label> 0:? 1:K(x,x1) ... L:K(x,xL)
+
+That is, in the training file the first column must be the "ID" of
+xi. In testing, ? can be any value.
+
+All kernel values including ZEROs must be explicitly provided. Any
+permutation or random subsets of the training/testing files are also
+valid (see examples below).
+
+Note: the format is slightly different from the precomputed kernel
+package released in libsvmtools earlier.
+
+Examples:
+
+ Assume the original training data has three four-feature
+ instances and testing data has one instance:
+
+ 15 1:1 2:1 3:1 4:1
+ 45 2:3 4:3
+ 25 3:1
+
+ 15 1:1 3:1
+
+ If the linear kernel is used, we have the following new
+ training/testing sets:
+
+ 15 0:1 1:4 2:6 3:1
+ 45 0:2 1:6 2:18 3:0
+ 25 0:3 1:1 2:0 3:1
+
+ 15 0:? 1:2 2:0 3:1
+
+ ? can be any value.
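+
+	As a quick check, the first new training line can be verified by
+	hand: with x1 = (1,1,1,1), x2 = (0,3,0,3), and x3 = (0,0,1,0),
+
+	K(x1,x1) = 1*1 + 1*1 + 1*1 + 1*1 = 4
+	K(x1,x2) = 1*3 + 1*3 = 6    (only indices 2 and 4 overlap)
+	K(x1,x3) = 1*1 = 1          (only index 3 overlaps)
+
+	which matches `15 0:1 1:4 2:6 3:1' above.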
+
+ Any subset of the above training file is also valid. For example,
+
+ 25 0:3 1:1 2:0 3:1
+ 45 0:2 1:6 2:18 3:0
+
+ implies that the kernel matrix is
+
+ [K(2,2) K(2,3)] = [18 0]
+ [K(3,2) K(3,3)] = [0 1]
+
+Library Usage
+=============
+
+These functions and structures are declared in the header file
+`svm.h'. You need to #include "svm.h" in your C/C++ source files and
+link your program with `svm.cpp'. You can see `svm-train.c' and
+`svm-predict.c' for examples showing how to use them. We define
+LIBSVM_VERSION and declare `extern int libsvm_version; ' in svm.h, so
+you can check the version number.
+
+Before you classify test data, you need to construct an SVM model
+(`svm_model') using training data. A model can also be saved in
+a file for later use. Once an SVM model is available, you can use it
+to classify new data.
+
+- Function: struct svm_model *svm_train(const struct svm_problem *prob,
+ const struct svm_parameter *param);
+
+ This function constructs and returns an SVM model according to
+ the given training data and parameters.
+
+ struct svm_problem describes the problem:
+
+ struct svm_problem
+ {
+ int l;
+ double *y;
+ struct svm_node **x;
+ };
+
+    where `l' is the number of training data, and `y' is an array
+    containing their target values (integers in classification, real
+    numbers in regression). `x' is an array of pointers, each of which
+    points to a sparse representation (array of svm_node) of one
+    training vector.
+
+ For example, if we have the following training data:
+
+ LABEL ATTR1 ATTR2 ATTR3 ATTR4 ATTR5
+ ----- ----- ----- ----- ----- -----
+ 1 0 0.1 0.2 0 0
+ 2 0 0.1 0.3 -1.2 0
+ 1 0.4 0 0 0 0
+ 2 0 0.1 0 1.4 0.5
+ 3 -0.1 -0.2 0.1 1.1 0.1
+
+ then the components of svm_problem are:
+
+ l = 5
+
+ y -> 1 2 1 2 3
+
+ x -> [ ] -> (2,0.1) (3,0.2) (-1,?)
+ [ ] -> (2,0.1) (3,0.3) (4,-1.2) (-1,?)
+ [ ] -> (1,0.4) (-1,?)
+ [ ] -> (2,0.1) (4,1.4) (5,0.5) (-1,?)
+ [ ] -> (1,-0.1) (2,-0.2) (3,0.1) (4,1.1) (5,0.1) (-1,?)
+
+ where (index,value) is stored in the structure `svm_node':
+
+ struct svm_node
+ {
+ int index;
+ double value;
+ };
+
+ index = -1 indicates the end of one vector. Note that indices must
+ be in ASCENDING order.
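+
+    For instance, a minimal sketch of building the first training
+    vector above by hand (the variable name `row' is illustrative):
+
+	/* sparse encoding of (0, 0.1, 0.2, 0, 0): store nonzeros only */
+	struct svm_node row[] = {
+		{2, 0.1},	/* index 2, value 0.1 */
+		{3, 0.2},	/* index 3, value 0.2 */
+		{-1, 0.0}	/* index = -1 ends the vector */
+	};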
+
+ struct svm_parameter describes the parameters of an SVM model:
+
+ struct svm_parameter
+ {
+ int svm_type;
+ int kernel_type;
+ int degree; /* for poly */
+ double gamma; /* for poly/rbf/sigmoid */
+ double coef0; /* for poly/sigmoid */
+
+ /* these are for training only */
+ double cache_size; /* in MB */
+ double eps; /* stopping criteria */
+ double C; /* for C_SVC, EPSILON_SVR, and NU_SVR */
+ int nr_weight; /* for C_SVC */
+ int *weight_label; /* for C_SVC */
+ double* weight; /* for C_SVC */
+ double nu; /* for NU_SVC, ONE_CLASS, and NU_SVR */
+ double p; /* for EPSILON_SVR */
+ int shrinking; /* use the shrinking heuristics */
+ int probability; /* do probability estimates */
+ };
+
+ svm_type can be one of C_SVC, NU_SVC, ONE_CLASS, EPSILON_SVR, NU_SVR.
+
+ C_SVC: C-SVM classification
+ NU_SVC: nu-SVM classification
+ ONE_CLASS: one-class-SVM
+ EPSILON_SVR: epsilon-SVM regression
+ NU_SVR: nu-SVM regression
+
+    kernel_type can be one of LINEAR, POLY, RBF, SIGMOID, PRECOMPUTED.
+
+ LINEAR: u'*v
+ POLY: (gamma*u'*v + coef0)^degree
+ RBF: exp(-gamma*|u-v|^2)
+ SIGMOID: tanh(gamma*u'*v + coef0)
+ PRECOMPUTED: kernel values in training_set_file
+
+ cache_size is the size of the kernel cache, specified in megabytes.
+    C is the cost of constraint violation.
+    eps is the stopping criterion (we usually use 0.00001 in nu-SVC
+    and 0.001 in others). nu is the parameter in nu-SVM, nu-SVR, and
+    one-class-SVM. p is the epsilon in the epsilon-insensitive loss
+    function of epsilon-SVM regression. shrinking = 1 means shrinking
+    is conducted; = 0 otherwise. probability = 1 means a model with
+    probability information is obtained; = 0 otherwise.
+
+    nr_weight, weight_label, and weight are used to change the penalty
+    for some classes (if the weight for a class is not changed, it is
+    set to 1). This is useful for training a classifier using unbalanced
+    input data or with asymmetric misclassification costs.
+
+ nr_weight is the number of elements in the array weight_label and
+ weight. Each weight[i] corresponds to weight_label[i], meaning that
+ the penalty of class weight_label[i] is scaled by a factor of weight[i].
+
+ If you do not want to change penalty for any of the classes,
+ just set nr_weight to 0.
+
+    *NOTE* Because svm_model contains pointers to svm_problem, you
+    cannot free the memory used by svm_problem if you are still using
+    the svm_model produced by svm_train().
+
+ *NOTE* To avoid wrong parameters, svm_check_parameter() should be
+ called before svm_train().
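+
+    A hedged sketch of a typical training call, assuming `prob' is an
+    svm_problem filled in as above and <stdio.h>/<stdlib.h> are
+    included (error handling shortened):
+
+	struct svm_parameter param;
+	param.svm_type = C_SVC;
+	param.kernel_type = RBF;
+	param.degree = 3;
+	param.gamma = 0.5;	/* e.g. 1.0/num_features */
+	param.coef0 = 0;
+	param.cache_size = 100;	/* in MB */
+	param.eps = 1e-3;
+	param.C = 1;
+	param.nr_weight = 0;	/* keep all class penalties at C */
+	param.weight_label = NULL;
+	param.weight = NULL;
+	param.nu = 0.5;
+	param.p = 0.1;
+	param.shrinking = 1;
+	param.probability = 0;
+
+	const char *err = svm_check_parameter(&prob, &param);
+	if(err != NULL) { fprintf(stderr, "ERROR: %s\n", err); exit(1); }
+	struct svm_model *model = svm_train(&prob, &param);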
+
+ struct svm_model stores the model obtained from the training procedure.
+ It is not recommended to directly access entries in this structure.
+ Programmers should use the interface functions to get the values.
+
+ struct svm_model
+ {
+ struct svm_parameter param; /* parameter */
+ int nr_class; /* number of classes, = 2 in regression/one class svm */
+ int l; /* total #SV */
+ struct svm_node **SV; /* SVs (SV[l]) */
+ double **sv_coef; /* coefficients for SVs in decision functions (sv_coef[k-1][l]) */
+ double *rho; /* constants in decision functions (rho[k*(k-1)/2]) */
+ double *probA; /* pairwise probability information */
+ double *probB;
+	int *sv_indices;        /* sv_indices[0,...,nSV-1] are values in [1,...,num_training_data] to indicate SVs in the training set */
+
+ /* for classification only */
+
+ int *label; /* label of each class (label[k]) */
+ int *nSV; /* number of SVs for each class (nSV[k]) */
+ /* nSV[0] + nSV[1] + ... + nSV[k-1] = l */
+ /* XXX */
+	int free_sv;		/* 1 if svm_model is created by svm_load_model */
+ /* 0 if svm_model is created by svm_train */
+ };
+
+ param describes the parameters used to obtain the model.
+
+ nr_class is the number of classes. It is 2 for regression and one-class SVM.
+
+ l is the number of support vectors. SV and sv_coef are support
+ vectors and the corresponding coefficients, respectively. Assume there are
+ k classes. For data in class j, the corresponding sv_coef includes (k-1) y*alpha vectors,
+ where alpha's are solutions of the following two class problems:
+ 1 vs j, 2 vs j, ..., j-1 vs j, j vs j+1, j vs j+2, ..., j vs k
+ and y=1 for the first j-1 vectors, while y=-1 for the remaining k-j
+ vectors. For example, if there are 4 classes, sv_coef and SV are like:
+
+ +-+-+-+--------------------+
+ |1|1|1| |
+ |v|v|v| SVs from class 1 |
+ |2|3|4| |
+ +-+-+-+--------------------+
+ |1|2|2| |
+ |v|v|v| SVs from class 2 |
+ |2|3|4| |
+ +-+-+-+--------------------+
+ |1|2|3| |
+ |v|v|v| SVs from class 3 |
+ |3|3|4| |
+ +-+-+-+--------------------+
+ |1|2|3| |
+ |v|v|v| SVs from class 4 |
+ |4|4|4| |
+ +-+-+-+--------------------+
+
+ See svm_train() for an example of assigning values to sv_coef.
+
+ rho is the bias term (-b). probA and probB are parameters used in
+ probability outputs. If there are k classes, there are k*(k-1)/2
+ binary problems as well as rho, probA, and probB values. They are
+ aligned in the order of binary problems:
+ 1 vs 2, 1 vs 3, ..., 1 vs k, 2 vs 3, ..., 2 vs k, ..., k-1 vs k.
+
+    sv_indices[0,...,nSV-1] are values in [1,...,num_training_data] to
+    indicate support vectors in the training set.
+
+ label contains labels in the training data.
+
+ nSV is the number of support vectors in each class.
+
+ free_sv is a flag used to determine whether the space of SV should
+ be released in free_model_content(struct svm_model*) and
+ free_and_destroy_model(struct svm_model**). If the model is
+ generated by svm_train(), then SV points to data in svm_problem
+ and should not be removed. For example, free_sv is 0 if svm_model
+ is created by svm_train, but is 1 if created by svm_load_model.
+
+- Function: double svm_predict(const struct svm_model *model,
+ const struct svm_node *x);
+
+ This function does classification or regression on a test vector x
+ given a model.
+
+    For a classification model, the predicted class for x is returned.
+    For a regression model, the function value of x calculated using
+    the model is returned. For a one-class model, +1 or -1 is
+    returned.
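+
+    A minimal sketch, assuming `model' was obtained as above; the test
+    vector here is hypothetical:
+
+	struct svm_node test[] = { {1, 0.3}, {-1, 0.0} };
+	double predicted_label = svm_predict(model, test);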
+
+- Function: void svm_cross_validation(const struct svm_problem *prob,
+ const struct svm_parameter *param, int nr_fold, double *target);
+
+    This function conducts cross validation. Data are separated into
+    nr_fold folds. Under given parameters, each fold is sequentially
+    validated using the model obtained from training the remaining
+    folds. Predicted labels (of all prob's instances) in the
+    validation process are stored in the array called target.
+
+    The format of svm_prob is the same as that for svm_train().
+
+- Function: int svm_get_svm_type(const struct svm_model *model);
+
+ This function gives svm_type of the model. Possible values of
+ svm_type are defined in svm.h.
+
+- Function: int svm_get_nr_class(const svm_model *model);
+
+ For a classification model, this function gives the number of
+    classes. For a regression or a one-class model, 2 is returned.
+
+- Function: void svm_get_labels(const svm_model *model, int* label)
+
+ For a classification model, this function outputs the name of
+ labels into an array called label. For regression and one-class
+ models, label is unchanged.
+
+- Function: void svm_get_sv_indices(const struct svm_model *model, int *sv_indices)
+
+ This function outputs indices of support vectors into an array called sv_indices.
+ The size of sv_indices is the number of support vectors and can be obtained by calling svm_get_nr_sv.
+    Each sv_indices[i] is in the range of [1, ..., num_training_data].
+
+- Function: int svm_get_nr_sv(const struct svm_model *model)
+
+    This function gives the total number of support vectors.
+
+- Function: double svm_get_svr_probability(const struct svm_model *model);
+
+ For a regression model with probability information, this function
+ outputs a value sigma > 0. For test data, we consider the
+ probability model: target value = predicted value + z, z: Laplace
+ distribution e^(-|z|/sigma)/(2sigma)
+
+ If the model is not for svr or does not contain required
+ information, 0 is returned.
+
+- Function: double svm_predict_values(const svm_model *model,
+ const svm_node *x, double* dec_values)
+
+    This function gives decision values on a test vector x given a
+    model, and returns the predicted label (classification) or
+    the function value (regression).
+
+ For a classification model with nr_class classes, this function
+ gives nr_class*(nr_class-1)/2 decision values in the array
+ dec_values, where nr_class can be obtained from the function
+ svm_get_nr_class. The order is label[0] vs. label[1], ...,
+ label[0] vs. label[nr_class-1], label[1] vs. label[2], ...,
+ label[nr_class-2] vs. label[nr_class-1], where label can be
+ obtained from the function svm_get_labels. The returned value is
+ the predicted class for x. Note that when nr_class = 1, this
+ function does not give any decision value.
+
+ For a regression model, dec_values[0] and the returned value are
+ both the function value of x calculated using the model. For a
+ one-class model, dec_values[0] is the decision value of x, while
+ the returned value is +1/-1.
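+
+    A sketch of allocating dec_values for a classification model
+    (assuming <stdlib.h>; `test' is an index-terminated svm_node
+    array as above):
+
+	int nr_class = svm_get_nr_class(model);
+	double *dec_values = malloc(sizeof(double)*nr_class*(nr_class-1)/2);
+	double label = svm_predict_values(model, test, dec_values);
+	/* dec_values[0] is the label[0] vs. label[1] decision value */
+	free(dec_values);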
+
+- Function: double svm_predict_probability(const struct svm_model *model,
+ const struct svm_node *x, double* prob_estimates);
+
+ This function does classification or regression on a test vector x
+ given a model with probability information.
+
+ For a classification model with probability information, this
+ function gives nr_class probability estimates in the array
+ prob_estimates. nr_class can be obtained from the function
+ svm_get_nr_class. The class with the highest probability is
+ returned. For regression/one-class SVM, the array prob_estimates
+ is unchanged and the returned value is the same as that of
+ svm_predict.
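+
+    Similarly, a sketch for probability outputs, assuming the model
+    was trained with probability = 1:
+
+	int k = svm_get_nr_class(model);
+	double *prob_estimates = malloc(sizeof(double)*k);
+	double label = svm_predict_probability(model, test, prob_estimates);
+	/* prob_estimates[i] corresponds to the i-th label reported
+	   by svm_get_labels() */
+	free(prob_estimates);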
+
+- Function: const char *svm_check_parameter(const struct svm_problem *prob,
+ const struct svm_parameter *param);
+
+ This function checks whether the parameters are within the feasible
+ range of the problem. This function should be called before calling
+ svm_train() and svm_cross_validation(). It returns NULL if the
+ parameters are feasible, otherwise an error message is returned.
+
+- Function: int svm_check_probability_model(const struct svm_model *model);
+
+ This function checks whether the model contains required
+ information to do probability estimates. If so, it returns
+ +1. Otherwise, 0 is returned. This function should be called
+ before calling svm_get_svr_probability and
+ svm_predict_probability.
+
+- Function: int svm_save_model(const char *model_file_name,
+ const struct svm_model *model);
+
+ This function saves a model to a file; returns 0 on success, or -1
+ if an error occurs.
+
+- Function: struct svm_model *svm_load_model(const char *model_file_name);
+
+ This function returns a pointer to the model read from the file,
+ or a null pointer if the model could not be loaded.
+
+- Function: void svm_free_model_content(struct svm_model *model_ptr);
+
+ This function frees the memory used by the entries in a model structure.
+
+- Function: void svm_free_and_destroy_model(struct svm_model **model_ptr_ptr);
+
+ This function frees the memory used by a model and destroys the model
+ structure. It is equivalent to svm_destroy_model, which
+ is deprecated after version 3.0.
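+
+    A sketch of the save/release/reload life cycle (the file name is
+    illustrative):
+
+	if(svm_save_model("heart_scale.model", model) != 0)
+		fprintf(stderr, "cannot save model\n");
+	svm_free_and_destroy_model(&model);	/* sets model to NULL */
+	model = svm_load_model("heart_scale.model");
+	if(model == NULL)
+		fprintf(stderr, "cannot load model\n");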
+
+- Function: void svm_destroy_param(struct svm_parameter *param);
+
+ This function frees the memory used by a parameter set.
+
+- Function: void svm_set_print_string_function(void (*print_func)(const char *));
+
+ Users can specify their output format by a function. Use
+ svm_set_print_string_function(NULL);
+ for default printing to stdout.
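+
+    For example, to silence all training output (this mirrors how
+    svm-train implements its -q option):
+
+	static void print_null(const char *s) {}
+
+	svm_set_print_string_function(&print_null);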
+
+Java Version
+============
+
+The pre-compiled java class archive `libsvm.jar' and its source files are
+in the java directory. To run the programs, use
+
+java -classpath libsvm.jar svm_train
+java -classpath libsvm.jar svm_predict
+java -classpath libsvm.jar svm_toy
+java -classpath libsvm.jar svm_scale
+
+Note that you need Java 1.5 (5.0) or above to run it.
+
+You may need to add Java runtime library (like classes.zip) to the classpath.
+You may need to increase maximum Java heap size.
+
+Library usage is similar to the C version. These functions are available:
+
+public class svm {
+ public static final int LIBSVM_VERSION=322;
+ public static svm_model svm_train(svm_problem prob, svm_parameter param);
+ public static void svm_cross_validation(svm_problem prob, svm_parameter param, int nr_fold, double[] target);
+ public static int svm_get_svm_type(svm_model model);
+ public static int svm_get_nr_class(svm_model model);
+ public static void svm_get_labels(svm_model model, int[] label);
+ public static void svm_get_sv_indices(svm_model model, int[] indices);
+ public static int svm_get_nr_sv(svm_model model);
+ public static double svm_get_svr_probability(svm_model model);
+ public static double svm_predict_values(svm_model model, svm_node[] x, double[] dec_values);
+ public static double svm_predict(svm_model model, svm_node[] x);
+ public static double svm_predict_probability(svm_model model, svm_node[] x, double[] prob_estimates);
+ public static void svm_save_model(String model_file_name, svm_model model) throws IOException
+ public static svm_model svm_load_model(String model_file_name) throws IOException
+ public static String svm_check_parameter(svm_problem prob, svm_parameter param);
+ public static int svm_check_probability_model(svm_model model);
+ public static void svm_set_print_string_function(svm_print_interface print_func);
+}
+
+The library is in the "libsvm" package.
+Note that in Java version, svm_node[] is not ended with a node whose index = -1.
+
+Users can specify their output format by
+
+ your_print_func = new svm_print_interface()
+ {
+ public void print(String s)
+ {
+ // your own format
+ }
+ };
+ svm.svm_set_print_string_function(your_print_func);
+
+Building Windows Binaries
+=========================
+
+Windows binaries are available in the directory `windows'. To re-build
+them via Visual C++, use the following steps:
+
+1. Open a DOS command box (or Visual Studio Command Prompt) and change
+to libsvm directory. If environment variables of VC++ have not been
+set, type
+
+""C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\bin\amd64\vcvars64.bat""
+
+You may have to modify the above command according to which version of
+VC++ you have and where it is installed.
+
+2. Type
+
+nmake -f Makefile.win clean all
+
+3. (optional) To build shared library libsvm.dll, type
+
+nmake -f Makefile.win lib
+
+4. (optional) To build 32-bit windows binaries, you must
+	(1) Set up "C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\bin\vcvars32.bat" instead of vcvars64.bat
+ (2) Change CFLAGS in Makefile.win: /D _WIN64 to /D _WIN32
+
+Another way is to build them from Visual C++ environment. See details
+in libsvm FAQ.
+
+Additional Tools: Sub-sampling, Parameter Selection, Format checking, etc.
+===========================================================================
+
+See the README file in the tools directory.
+
+MATLAB/OCTAVE Interface
+=======================
+
+Please check the file README in the directory `matlab'.
+
+Python Interface
+================
+
+See the README file in python directory.
+
+Additional Information
+======================
+
+If you find LIBSVM helpful, please cite it as
+
+Chih-Chung Chang and Chih-Jen Lin, LIBSVM : a library for support
+vector machines. ACM Transactions on Intelligent Systems and
+Technology, 2:27:1--27:27, 2011. Software available at
+http://www.csie.ntu.edu.tw/~cjlin/libsvm
+
+LIBSVM implementation document is available at
+http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.pdf
+
+For any questions and comments, please email cjlin@csie.ntu.edu.tw
+
+Acknowledgments:
+This work was supported in part by the National Science
+Council of Taiwan via the grant NSC 89-2213-E-002-013.
+The authors thank their group members and users
+for many helpful discussions and comments. They are listed in
+http://www.csie.ntu.edu.tw/~cjlin/libsvm/acknowledgements
+
diff --git a/src/backend/app/algorithms/evaluate/libsvm/__init__.py b/src/backend/app/algorithms/evaluate/libsvm/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/src/backend/app/algorithms/evaluate/libsvm/heart_scale b/src/backend/app/algorithms/evaluate/libsvm/heart_scale
new file mode 100644
index 0000000..23bac94
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/heart_scale
@@ -0,0 +1,270 @@
++1 1:0.708333 2:1 3:1 4:-0.320755 5:-0.105023 6:-1 7:1 8:-0.419847 9:-1 10:-0.225806 12:1 13:-1
+-1 1:0.583333 2:-1 3:0.333333 4:-0.603774 5:1 6:-1 7:1 8:0.358779 9:-1 10:-0.483871 12:-1 13:1
++1 1:0.166667 2:1 3:-0.333333 4:-0.433962 5:-0.383562 6:-1 7:-1 8:0.0687023 9:-1 10:-0.903226 11:-1 12:-1 13:1
+-1 1:0.458333 2:1 3:1 4:-0.358491 5:-0.374429 6:-1 7:-1 8:-0.480916 9:1 10:-0.935484 12:-0.333333 13:1
+-1 1:0.875 2:-1 3:-0.333333 4:-0.509434 5:-0.347032 6:-1 7:1 8:-0.236641 9:1 10:-0.935484 11:-1 12:-0.333333 13:-1
+-1 1:0.5 2:1 3:1 4:-0.509434 5:-0.767123 6:-1 7:-1 8:0.0534351 9:-1 10:-0.870968 11:-1 12:-1 13:1
++1 1:0.125 2:1 3:0.333333 4:-0.320755 5:-0.406393 6:1 7:1 8:0.0839695 9:1 10:-0.806452 12:-0.333333 13:0.5
++1 1:0.25 2:1 3:1 4:-0.698113 5:-0.484018 6:-1 7:1 8:0.0839695 9:1 10:-0.612903 12:-0.333333 13:1
++1 1:0.291667 2:1 3:1 4:-0.132075 5:-0.237443 6:-1 7:1 8:0.51145 9:-1 10:-0.612903 12:0.333333 13:1
++1 1:0.416667 2:-1 3:1 4:0.0566038 5:0.283105 6:-1 7:1 8:0.267176 9:-1 10:0.290323 12:1 13:1
+-1 1:0.25 2:1 3:1 4:-0.226415 5:-0.506849 6:-1 7:-1 8:0.374046 9:-1 10:-0.83871 12:-1 13:1
+-1 2:1 3:1 4:-0.0943396 5:-0.543379 6:-1 7:1 8:-0.389313 9:1 10:-1 11:-1 12:-1 13:1
+-1 1:-0.375 2:1 3:0.333333 4:-0.132075 5:-0.502283 6:-1 7:1 8:0.664122 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:0.333333 2:1 3:-1 4:-0.245283 5:-0.506849 6:-1 7:-1 8:0.129771 9:-1 10:-0.16129 12:0.333333 13:-1
+-1 1:0.166667 2:-1 3:1 4:-0.358491 5:-0.191781 6:-1 7:1 8:0.343511 9:-1 10:-1 11:-1 12:-0.333333 13:-1
+-1 1:0.75 2:-1 3:1 4:-0.660377 5:-0.894977 6:-1 7:-1 8:-0.175573 9:-1 10:-0.483871 12:-1 13:-1
++1 1:-0.291667 2:1 3:1 4:-0.132075 5:-0.155251 6:-1 7:-1 8:-0.251908 9:1 10:-0.419355 12:0.333333 13:1
++1 2:1 3:1 4:-0.132075 5:-0.648402 6:1 7:1 8:0.282443 9:1 11:1 12:-1 13:1
+-1 1:0.458333 2:1 3:-1 4:-0.698113 5:-0.611872 6:-1 7:1 8:0.114504 9:1 10:-0.419355 12:-1 13:-1
+-1 1:-0.541667 2:1 3:-1 4:-0.132075 5:-0.666667 6:-1 7:-1 8:0.633588 9:1 10:-0.548387 11:-1 12:-1 13:1
++1 1:0.583333 2:1 3:1 4:-0.509434 5:-0.52968 6:-1 7:1 8:-0.114504 9:1 10:-0.16129 12:0.333333 13:1
+-1 1:-0.208333 2:1 3:-0.333333 4:-0.320755 5:-0.456621 6:-1 7:1 8:0.664122 9:-1 10:-0.935484 12:-1 13:-1
+-1 1:-0.416667 2:1 3:1 4:-0.603774 5:-0.191781 6:-1 7:-1 8:0.679389 9:-1 10:-0.612903 12:-1 13:-1
+-1 1:-0.25 2:1 3:1 4:-0.660377 5:-0.643836 6:-1 7:-1 8:0.0992366 9:-1 10:-0.967742 11:-1 12:-1 13:-1
+-1 1:0.0416667 2:-1 3:-0.333333 4:-0.283019 5:-0.260274 6:1 7:1 8:0.343511 9:1 10:-1 11:-1 12:-0.333333 13:-1
+-1 1:-0.208333 2:-1 3:0.333333 4:-0.320755 5:-0.319635 6:-1 7:-1 8:0.0381679 9:-1 10:-0.935484 11:-1 12:-1 13:-1
+-1 1:-0.291667 2:-1 3:1 4:-0.169811 5:-0.465753 6:-1 7:1 8:0.236641 9:1 10:-1 12:-1 13:-1
+-1 1:-0.0833333 2:-1 3:0.333333 4:-0.509434 5:-0.228311 6:-1 7:1 8:0.312977 9:-1 10:-0.806452 11:-1 12:-1 13:-1
++1 1:0.208333 2:1 3:0.333333 4:-0.660377 5:-0.525114 6:-1 7:1 8:0.435115 9:-1 10:-0.193548 12:-0.333333 13:1
+-1 1:0.75 2:-1 3:0.333333 4:-0.698113 5:-0.365297 6:1 7:1 8:-0.0992366 9:-1 10:-1 11:-1 12:-0.333333 13:-1
++1 1:0.166667 2:1 3:0.333333 4:-0.358491 5:-0.52968 6:-1 7:1 8:0.206107 9:-1 10:-0.870968 12:-0.333333 13:1
+-1 1:0.541667 2:1 3:1 4:0.245283 5:-0.534247 6:-1 7:1 8:0.0229008 9:-1 10:-0.258065 11:-1 12:-1 13:0.5
+-1 1:-0.666667 2:-1 3:0.333333 4:-0.509434 5:-0.593607 6:-1 7:-1 8:0.51145 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:0.25 2:1 3:1 4:0.433962 5:-0.086758 6:-1 7:1 8:0.0534351 9:1 10:0.0967742 11:1 12:-1 13:1
++1 1:-0.125 2:1 3:1 4:-0.0566038 5:-0.6621 6:-1 7:1 8:-0.160305 9:1 10:-0.709677 12:-1 13:1
++1 1:-0.208333 2:1 3:1 4:-0.320755 5:-0.406393 6:1 7:1 8:0.206107 9:1 10:-1 11:-1 12:0.333333 13:1
++1 1:0.333333 2:1 3:1 4:-0.132075 5:-0.630137 6:-1 7:1 8:0.0229008 9:1 10:-0.387097 11:-1 12:-0.333333 13:1
++1 1:0.25 2:1 3:-1 4:0.245283 5:-0.328767 6:-1 7:1 8:-0.175573 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:-0.458333 2:1 3:0.333333 4:-0.320755 5:-0.753425 6:-1 7:-1 8:0.206107 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:-0.208333 2:1 3:1 4:-0.471698 5:-0.561644 6:-1 7:1 8:0.755725 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:-0.541667 2:1 3:1 4:0.0943396 5:-0.557078 6:-1 7:-1 8:0.679389 9:-1 10:-1 11:-1 12:-1 13:1
+-1 1:0.375 2:-1 3:1 4:-0.433962 5:-0.621005 6:-1 7:-1 8:0.40458 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:-0.375 2:1 3:0.333333 4:-0.320755 5:-0.511416 6:-1 7:-1 8:0.648855 9:1 10:-0.870968 11:-1 12:-1 13:-1
+-1 1:-0.291667 2:1 3:-0.333333 4:-0.867925 5:-0.675799 6:1 7:-1 8:0.29771 9:-1 10:-1 11:-1 12:-1 13:1
++1 1:0.25 2:1 3:0.333333 4:-0.396226 5:-0.579909 6:1 7:-1 8:-0.0381679 9:-1 10:-0.290323 12:-0.333333 13:0.5
+-1 1:0.208333 2:1 3:0.333333 4:-0.132075 5:-0.611872 6:1 7:1 8:0.435115 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:-0.166667 2:1 3:0.333333 4:-0.54717 5:-0.894977 6:-1 7:1 8:-0.160305 9:-1 10:-0.741935 11:-1 12:1 13:-1
++1 1:-0.375 2:1 3:1 4:-0.698113 5:-0.675799 6:-1 7:1 8:0.618321 9:-1 10:-1 11:-1 12:-0.333333 13:-1
++1 1:0.541667 2:1 3:-0.333333 4:0.245283 5:-0.452055 6:-1 7:-1 8:-0.251908 9:1 10:-1 12:1 13:0.5
++1 1:0.5 2:-1 3:1 4:0.0566038 5:-0.547945 6:-1 7:1 8:-0.343511 9:-1 10:-0.677419 12:1 13:1
++1 1:-0.458333 2:1 3:1 4:-0.207547 5:-0.136986 6:-1 7:-1 8:-0.175573 9:1 10:-0.419355 12:-1 13:0.5
+-1 1:-0.0416667 2:1 3:-0.333333 4:-0.358491 5:-0.639269 6:1 7:-1 8:0.725191 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:0.5 2:-1 3:0.333333 4:-0.132075 5:0.328767 6:1 7:1 8:0.312977 9:-1 10:-0.741935 11:-1 12:-0.333333 13:-1
+-1 1:0.416667 2:-1 3:-0.333333 4:-0.132075 5:-0.684932 6:-1 7:-1 8:0.648855 9:-1 10:-1 11:-1 12:0.333333 13:-1
+-1 1:-0.333333 2:-1 3:-0.333333 4:-0.320755 5:-0.506849 6:-1 7:1 8:0.587786 9:-1 10:-0.806452 12:-1 13:-1
+-1 1:-0.5 2:-1 3:-0.333333 4:-0.792453 5:-0.671233 6:-1 7:-1 8:0.480916 9:-1 10:-1 11:-1 12:-0.333333 13:-1
++1 1:0.333333 2:1 3:1 4:-0.169811 5:-0.817352 6:-1 7:1 8:-0.175573 9:1 10:0.16129 12:-0.333333 13:-1
+-1 1:0.291667 2:-1 3:0.333333 4:-0.509434 5:-0.762557 6:1 7:-1 8:-0.618321 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:0.25 2:-1 3:1 4:0.509434 5:-0.438356 6:-1 7:-1 8:0.0992366 9:1 10:-1 12:-1 13:-1
++1 1:0.375 2:1 3:-0.333333 4:-0.509434 5:-0.292237 6:-1 7:1 8:-0.51145 9:-1 10:-0.548387 12:-0.333333 13:1
+-1 1:0.166667 2:1 3:0.333333 4:0.0566038 5:-1 6:1 7:-1 8:0.557252 9:-1 10:-0.935484 11:-1 12:-0.333333 13:1
++1 1:-0.0833333 2:-1 3:1 4:-0.320755 5:-0.182648 6:-1 7:-1 8:0.0839695 9:1 10:-0.612903 12:-1 13:1
+-1 1:-0.375 2:1 3:0.333333 4:-0.509434 5:-0.543379 6:-1 7:-1 8:0.496183 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:0.291667 2:-1 3:-1 4:0.0566038 5:-0.479452 6:-1 7:-1 8:0.526718 9:-1 10:-0.709677 11:-1 12:-1 13:-1
+-1 1:0.416667 2:1 3:-1 4:-0.0377358 5:-0.511416 6:1 7:1 8:0.206107 9:-1 10:-0.258065 11:1 12:-1 13:0.5
++1 1:0.166667 2:1 3:1 4:0.0566038 5:-0.315068 6:-1 7:1 8:-0.374046 9:1 10:-0.806452 12:-0.333333 13:0.5
+-1 1:-0.0833333 2:1 3:1 4:-0.132075 5:-0.383562 6:-1 7:1 8:0.755725 9:1 10:-1 11:-1 12:-1 13:-1
++1 1:0.208333 2:-1 3:-0.333333 4:-0.207547 5:-0.118721 6:1 7:1 8:0.236641 9:-1 10:-1 11:-1 12:0.333333 13:-1
+-1 1:-0.375 2:-1 3:0.333333 4:-0.54717 5:-0.47032 6:-1 7:-1 8:0.19084 9:-1 10:-0.903226 12:-0.333333 13:-1
++1 1:-0.25 2:1 3:0.333333 4:-0.735849 5:-0.465753 6:-1 7:-1 8:0.236641 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:0.333333 2:1 3:1 4:-0.509434 5:-0.388128 6:-1 7:-1 8:0.0534351 9:1 10:0.16129 12:-0.333333 13:1
+-1 1:0.166667 2:-1 3:1 4:-0.509434 5:0.0410959 6:-1 7:-1 8:0.40458 9:1 10:-0.806452 11:-1 12:-1 13:-1
+-1 1:0.708333 2:1 3:-0.333333 4:0.169811 5:-0.456621 6:-1 7:1 8:0.0992366 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:0.958333 2:-1 3:0.333333 4:-0.132075 5:-0.675799 6:-1 8:-0.312977 9:-1 10:-0.645161 12:-1 13:-1
+-1 1:0.583333 2:-1 3:1 4:-0.773585 5:-0.557078 6:-1 7:-1 8:0.0839695 9:-1 10:-0.903226 11:-1 12:0.333333 13:-1
++1 1:-0.333333 2:1 3:1 4:-0.0943396 5:-0.164384 6:-1 7:1 8:0.160305 9:1 10:-1 12:1 13:1
+-1 1:-0.333333 2:1 3:1 4:-0.811321 5:-0.625571 6:-1 7:1 8:0.175573 9:1 10:-0.0322581 12:-1 13:-1
+-1 1:-0.583333 2:-1 3:0.333333 4:-1 5:-0.666667 6:-1 7:-1 8:0.648855 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:-0.458333 2:-1 3:0.333333 4:-0.509434 5:-0.621005 6:-1 7:-1 8:0.557252 9:-1 10:-1 12:-1 13:-1
+-1 1:0.125 2:1 3:-0.333333 4:-0.509434 5:-0.497717 6:-1 7:-1 8:0.633588 9:-1 10:-0.741935 11:-1 12:-1 13:-1
++1 1:0.208333 2:1 3:1 4:-0.0188679 5:-0.579909 6:-1 7:-1 8:-0.480916 9:-1 10:-0.354839 12:-0.333333 13:1
++1 1:-0.75 2:1 3:1 4:-0.509434 5:-0.671233 6:-1 7:-1 8:-0.0992366 9:1 10:-0.483871 12:-1 13:1
++1 1:0.208333 2:1 3:1 4:0.0566038 5:-0.342466 6:-1 7:1 8:-0.389313 9:1 10:-0.741935 11:-1 12:-1 13:1
+-1 1:-0.5 2:1 3:0.333333 4:-0.320755 5:-0.598174 6:-1 7:1 8:0.480916 9:-1 10:-0.354839 12:-1 13:-1
+-1 1:0.166667 2:1 3:1 4:-0.698113 5:-0.657534 6:-1 7:-1 8:-0.160305 9:1 10:-0.516129 12:-1 13:0.5
+-1 1:-0.458333 2:1 3:-1 4:0.0188679 5:-0.461187 6:-1 7:1 8:0.633588 9:-1 10:-0.741935 11:-1 12:0.333333 13:-1
+-1 1:0.375 2:1 3:-0.333333 4:-0.358491 5:-0.625571 6:1 7:1 8:0.0534351 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:0.25 2:1 3:-1 4:0.584906 5:-0.342466 6:-1 7:1 8:0.129771 9:-1 10:0.354839 11:1 12:-1 13:1
+-1 1:-0.5 2:-1 3:-0.333333 4:-0.396226 5:-0.178082 6:-1 7:-1 8:0.40458 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:-0.125 2:1 3:1 4:0.0566038 5:-0.465753 6:-1 7:1 8:-0.129771 9:-1 10:-0.16129 12:-1 13:1
+-1 1:0.25 2:1 3:-0.333333 4:-0.132075 5:-0.56621 6:-1 7:-1 8:0.419847 9:1 10:-1 11:-1 12:-1 13:-1
++1 1:0.333333 2:-1 3:1 4:-0.320755 5:-0.0684932 6:-1 7:1 8:0.496183 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:0.0416667 2:1 3:1 4:-0.433962 5:-0.360731 6:-1 7:1 8:-0.419847 9:1 10:-0.290323 12:-0.333333 13:1
++1 1:0.0416667 2:1 3:1 4:-0.698113 5:-0.634703 6:-1 7:1 8:-0.435115 9:1 10:-1 12:-0.333333 13:-1
++1 1:-0.0416667 2:1 3:1 4:-0.415094 5:-0.607306 6:-1 7:-1 8:0.480916 9:-1 10:-0.677419 11:-1 12:0.333333 13:1
++1 1:-0.25 2:1 3:1 4:-0.698113 5:-0.319635 6:-1 7:1 8:-0.282443 9:1 10:-0.677419 12:-0.333333 13:-1
+-1 1:0.541667 2:1 3:1 4:-0.509434 5:-0.196347 6:-1 7:1 8:0.221374 9:-1 10:-0.870968 12:-1 13:-1
++1 1:0.208333 2:1 3:1 4:-0.886792 5:-0.506849 6:-1 7:-1 8:0.29771 9:-1 10:-0.967742 11:-1 12:-0.333333 13:1
+-1 1:0.458333 2:-1 3:0.333333 4:-0.132075 5:-0.146119 6:-1 7:-1 8:-0.0534351 9:-1 10:-0.935484 11:-1 12:-1 13:1
+-1 1:-0.125 2:-1 3:-0.333333 4:-0.509434 5:-0.461187 6:-1 7:-1 8:0.389313 9:-1 10:-0.645161 11:-1 12:-1 13:-1
+-1 1:-0.375 2:-1 3:0.333333 4:-0.735849 5:-0.931507 6:-1 7:-1 8:0.587786 9:-1 10:-0.806452 12:-1 13:-1
++1 1:0.583333 2:1 3:1 4:-0.509434 5:-0.493151 6:-1 7:-1 8:-1 9:-1 10:-0.677419 12:-1 13:-1
+-1 1:-0.166667 2:-1 3:1 4:-0.320755 5:-0.347032 6:-1 7:-1 8:0.40458 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:0.166667 2:1 3:1 4:0.339623 5:-0.255708 6:1 7:1 8:-0.19084 9:-1 10:-0.677419 12:1 13:1
++1 1:0.416667 2:1 3:1 4:-0.320755 5:-0.415525 6:-1 7:1 8:0.160305 9:-1 10:-0.548387 12:-0.333333 13:1
++1 1:-0.208333 2:1 3:1 4:-0.433962 5:-0.324201 6:-1 7:1 8:0.450382 9:-1 10:-0.83871 12:-1 13:1
+-1 1:-0.0833333 2:1 3:0.333333 4:-0.886792 5:-0.561644 6:-1 7:-1 8:0.0992366 9:1 10:-0.612903 12:-1 13:-1
++1 1:0.291667 2:-1 3:1 4:0.0566038 5:-0.39726 6:-1 7:1 8:0.312977 9:-1 10:-0.16129 12:0.333333 13:1
++1 1:0.25 2:1 3:1 4:-0.132075 5:-0.767123 6:-1 7:-1 8:0.389313 9:1 10:-1 11:-1 12:-0.333333 13:1
+-1 1:-0.333333 2:-1 3:-0.333333 4:-0.660377 5:-0.844749 6:-1 7:-1 8:0.0229008 9:-1 10:-1 12:-1 13:-1
++1 1:0.0833333 2:-1 3:1 4:0.622642 5:-0.0821918 6:-1 8:-0.29771 9:1 10:0.0967742 12:-1 13:-1
+-1 1:-0.5 2:1 3:-0.333333 4:-0.698113 5:-0.502283 6:-1 7:-1 8:0.251908 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:0.291667 2:-1 3:1 4:0.207547 5:-0.182648 6:-1 7:1 8:0.374046 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:0.0416667 2:-1 3:0.333333 4:-0.226415 5:-0.187215 6:1 7:-1 8:0.51145 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:-0.458333 2:1 3:-0.333333 4:-0.509434 5:-0.228311 6:-1 7:-1 8:0.389313 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:-0.166667 2:-1 3:-0.333333 4:-0.245283 5:-0.3379 6:-1 7:-1 8:0.389313 9:-1 10:-1 12:-1 13:-1
++1 1:-0.291667 2:1 3:1 4:-0.509434 5:-0.438356 6:-1 7:1 8:0.114504 9:-1 10:-0.741935 11:-1 12:-1 13:1
++1 1:0.125 2:-1 3:1 4:1 5:-0.260274 6:1 7:1 8:-0.0534351 9:1 10:0.290323 11:1 12:0.333333 13:1
+-1 1:0.541667 2:-1 3:-1 4:0.0566038 5:-0.543379 6:-1 7:-1 8:-0.343511 9:-1 10:-0.16129 11:1 12:-1 13:-1
++1 1:0.125 2:1 3:1 4:-0.320755 5:-0.283105 6:1 7:1 8:-0.51145 9:1 10:-0.483871 11:1 12:-1 13:1
++1 1:-0.166667 2:1 3:0.333333 4:-0.509434 5:-0.716895 6:-1 7:-1 8:0.0381679 9:-1 10:-0.354839 12:1 13:1
++1 1:0.0416667 2:1 3:1 4:-0.471698 5:-0.269406 6:-1 7:1 8:-0.312977 9:1 10:0.0322581 12:0.333333 13:-1
++1 1:0.166667 2:1 3:1 4:0.0943396 5:-0.324201 6:-1 7:-1 8:-0.740458 9:1 10:-0.612903 12:-0.333333 13:1
+-1 1:0.5 2:-1 3:0.333333 4:0.245283 5:0.0684932 6:-1 7:1 8:0.221374 9:-1 10:-0.741935 11:-1 12:-1 13:-1
+-1 1:0.0416667 2:1 3:0.333333 4:-0.415094 5:-0.328767 6:-1 7:1 8:0.236641 9:-1 10:-0.83871 11:1 12:-0.333333 13:-1
+-1 1:0.0416667 2:-1 3:0.333333 4:0.245283 5:-0.657534 6:-1 7:-1 8:0.40458 9:-1 10:-1 11:-1 12:-0.333333 13:-1
++1 1:0.375 2:1 3:1 4:-0.509434 5:-0.356164 6:-1 7:-1 8:-0.572519 9:1 10:-0.419355 12:0.333333 13:1
+-1 1:-0.0416667 2:-1 3:0.333333 4:-0.207547 5:-0.680365 6:-1 7:1 8:0.496183 9:-1 10:-0.967742 12:-1 13:-1
+-1 1:-0.0416667 2:1 3:-0.333333 4:-0.245283 5:-0.657534 6:-1 7:-1 8:0.328244 9:-1 10:-0.741935 11:-1 12:-0.333333 13:-1
++1 1:0.291667 2:1 3:1 4:-0.566038 5:-0.525114 6:1 7:-1 8:0.358779 9:1 10:-0.548387 11:-1 12:0.333333 13:1
++1 1:0.416667 2:-1 3:1 4:-0.735849 5:-0.347032 6:-1 7:-1 8:0.496183 9:1 10:-0.419355 12:0.333333 13:-1
++1 1:0.541667 2:1 3:1 4:-0.660377 5:-0.607306 6:-1 7:1 8:-0.0687023 9:1 10:-0.967742 11:-1 12:-0.333333 13:-1
+-1 1:-0.458333 2:1 3:1 4:-0.132075 5:-0.543379 6:-1 7:-1 8:0.633588 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:0.458333 2:1 3:1 4:-0.509434 5:-0.452055 6:-1 7:1 8:-0.618321 9:1 10:-0.290323 11:1 12:-0.333333 13:-1
+-1 1:0.0416667 2:1 3:0.333333 4:0.0566038 5:-0.515982 6:-1 7:1 8:0.435115 9:-1 10:-0.483871 11:-1 12:-1 13:1
+-1 1:-0.291667 2:-1 3:0.333333 4:-0.0943396 5:-0.767123 6:-1 7:1 8:0.358779 9:1 10:-0.548387 11:1 12:-1 13:-1
+-1 1:0.583333 2:-1 3:0.333333 4:0.0943396 5:-0.310502 6:-1 7:-1 8:0.541985 9:-1 10:-1 11:-1 12:-0.333333 13:-1
++1 1:0.125 2:1 3:1 4:-0.415094 5:-0.438356 6:1 7:1 8:0.114504 9:1 10:-0.612903 12:-0.333333 13:-1
+-1 1:-0.791667 2:-1 3:-0.333333 4:-0.54717 5:-0.616438 6:-1 7:-1 8:0.847328 9:-1 10:-0.774194 11:-1 12:-1 13:-1
+-1 1:0.166667 2:1 3:1 4:-0.283019 5:-0.630137 6:-1 7:-1 8:0.480916 9:1 10:-1 11:-1 12:-1 13:1
++1 1:0.458333 2:1 3:1 4:-0.0377358 5:-0.607306 6:-1 7:1 8:-0.0687023 9:-1 10:-0.354839 12:0.333333 13:0.5
+-1 1:0.25 2:1 3:1 4:-0.169811 5:-0.3379 6:-1 7:1 8:0.694656 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:-0.125 2:1 3:0.333333 4:-0.132075 5:-0.511416 6:-1 7:-1 8:0.40458 9:-1 10:-0.806452 12:-0.333333 13:1
+-1 1:-0.0833333 2:1 3:-1 4:-0.415094 5:-0.60274 6:-1 7:1 8:-0.175573 9:1 10:-0.548387 11:-1 12:-0.333333 13:-1
++1 1:0.0416667 2:1 3:-0.333333 4:0.849057 5:-0.283105 6:-1 7:1 8:0.89313 9:-1 10:-1 11:-1 12:-0.333333 13:1
++1 2:1 3:1 4:-0.45283 5:-0.287671 6:-1 7:-1 8:-0.633588 9:1 10:-0.354839 12:0.333333 13:1
++1 1:-0.0416667 2:1 3:1 4:-0.660377 5:-0.525114 6:-1 7:-1 8:0.358779 9:-1 10:-1 11:-1 12:-0.333333 13:-1
++1 1:-0.541667 2:1 3:1 4:-0.698113 5:-0.812785 6:-1 7:1 8:-0.343511 9:1 10:-0.354839 12:-1 13:1
++1 1:0.208333 2:1 3:0.333333 4:-0.283019 5:-0.552511 6:-1 7:1 8:0.557252 9:-1 10:0.0322581 11:-1 12:0.333333 13:1
+-1 1:-0.5 2:-1 3:0.333333 4:-0.660377 5:-0.351598 6:-1 7:1 8:0.541985 9:1 10:-1 11:-1 12:-1 13:-1
+-1 1:-0.5 2:1 3:0.333333 4:-0.660377 5:-0.43379 6:-1 7:-1 8:0.648855 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:-0.125 2:-1 3:0.333333 4:-0.509434 5:-0.575342 6:-1 7:-1 8:0.328244 9:-1 10:-0.483871 12:-1 13:-1
+-1 1:0.0416667 2:-1 3:0.333333 4:-0.735849 5:-0.356164 6:-1 7:1 8:0.465649 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:0.458333 2:-1 3:1 4:-0.320755 5:-0.191781 6:-1 7:-1 8:-0.221374 9:-1 10:-0.354839 12:0.333333 13:-1
+-1 1:-0.0833333 2:-1 3:0.333333 4:-0.320755 5:-0.406393 6:-1 7:1 8:0.19084 9:-1 10:-0.83871 11:-1 12:-1 13:-1
+-1 1:-0.291667 2:-1 3:-0.333333 4:-0.792453 5:-0.643836 6:-1 7:-1 8:0.541985 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:0.0833333 2:1 3:1 4:-0.132075 5:-0.584475 6:-1 7:-1 8:-0.389313 9:1 10:0.806452 11:1 12:-1 13:1
+-1 1:-0.333333 2:1 3:-0.333333 4:-0.358491 5:-0.16895 6:-1 7:1 8:0.51145 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:0.125 2:1 3:-1 4:-0.509434 5:-0.694064 6:-1 7:1 8:0.389313 9:-1 10:-0.387097 12:-1 13:1
++1 1:0.541667 2:-1 3:1 4:0.584906 5:-0.534247 6:1 7:-1 8:0.435115 9:1 10:-0.677419 12:0.333333 13:1
++1 1:-0.625 2:1 3:-1 4:-0.509434 5:-0.520548 6:-1 7:-1 8:0.694656 9:1 10:0.225806 12:-1 13:1
++1 1:0.375 2:-1 3:1 4:0.0566038 5:-0.461187 6:-1 7:-1 8:0.267176 9:1 10:-0.548387 12:-1 13:-1
+-1 1:0.0833333 2:1 3:-0.333333 4:-0.320755 5:-0.378995 6:-1 7:-1 8:0.282443 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:0.208333 2:1 3:1 4:-0.358491 5:-0.392694 6:-1 7:1 8:-0.0992366 9:1 10:-0.0322581 12:0.333333 13:1
+-1 1:-0.416667 2:1 3:1 4:-0.698113 5:-0.611872 6:-1 7:-1 8:0.374046 9:-1 10:-1 11:-1 12:-1 13:1
+-1 1:0.458333 2:-1 3:1 4:0.622642 5:-0.0913242 6:-1 7:-1 8:0.267176 9:1 10:-1 11:-1 12:-1 13:-1
+-1 1:-0.125 2:-1 3:1 4:-0.698113 5:-0.415525 6:-1 7:1 8:0.343511 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 2:1 3:0.333333 4:-0.320755 5:-0.675799 6:1 7:1 8:0.236641 9:-1 10:-0.612903 11:1 12:-1 13:-1
+-1 1:-0.333333 2:-1 3:1 4:-0.169811 5:-0.497717 6:-1 7:1 8:0.236641 9:1 10:-0.935484 12:-1 13:-1
++1 1:0.5 2:1 3:-1 4:-0.169811 5:-0.287671 6:1 7:1 8:0.572519 9:-1 10:-0.548387 12:-0.333333 13:-1
+-1 1:0.666667 2:1 3:-1 4:0.245283 5:-0.506849 6:1 7:1 8:-0.0839695 9:-1 10:-0.967742 12:-0.333333 13:-1
++1 1:0.666667 2:1 3:0.333333 4:-0.132075 5:-0.415525 6:-1 7:1 8:0.145038 9:-1 10:-0.354839 12:1 13:1
++1 1:0.583333 2:1 3:1 4:-0.886792 5:-0.210046 6:-1 7:1 8:-0.175573 9:1 10:-0.709677 12:0.333333 13:-1
+-1 1:0.625 2:-1 3:0.333333 4:-0.509434 5:-0.611872 6:-1 7:1 8:-0.328244 9:-1 10:-0.516129 12:-1 13:-1
+-1 1:-0.791667 2:1 3:-1 4:-0.54717 5:-0.744292 6:-1 7:1 8:0.572519 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:0.375 2:-1 3:1 4:-0.169811 5:-0.232877 6:1 7:-1 8:-0.465649 9:-1 10:-0.387097 12:1 13:-1
++1 1:-0.0833333 2:1 3:1 4:-0.132075 5:-0.214612 6:-1 7:-1 8:-0.221374 9:1 10:0.354839 12:1 13:1
++1 1:-0.291667 2:1 3:0.333333 4:0.0566038 5:-0.520548 6:-1 7:-1 8:0.160305 9:-1 10:0.16129 12:-1 13:-1
++1 1:0.583333 2:1 3:1 4:-0.415094 5:-0.415525 6:1 7:-1 8:0.40458 9:-1 10:-0.935484 12:0.333333 13:1
+-1 1:-0.125 2:1 3:0.333333 4:-0.339623 5:-0.680365 6:-1 7:-1 8:0.40458 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:-0.458333 2:1 3:0.333333 4:-0.509434 5:-0.479452 6:1 7:-1 8:0.877863 9:-1 10:-0.741935 11:1 12:-1 13:1
++1 1:0.125 2:-1 3:1 4:-0.245283 5:0.292237 6:-1 7:1 8:0.206107 9:1 10:-0.387097 12:0.333333 13:1
++1 1:-0.5 2:1 3:1 4:-0.698113 5:-0.789954 6:-1 7:1 8:0.328244 9:-1 10:-1 11:-1 12:-1 13:1
+-1 1:-0.458333 2:-1 3:1 4:-0.849057 5:-0.365297 6:-1 7:1 8:-0.221374 9:-1 10:-0.806452 12:-1 13:-1
+-1 2:1 3:0.333333 4:-0.320755 5:-0.452055 6:1 7:1 8:0.557252 9:-1 10:-1 11:-1 12:1 13:-1
+-1 1:-0.416667 2:1 3:0.333333 4:-0.320755 5:-0.136986 6:-1 7:-1 8:0.389313 9:-1 10:-0.387097 11:-1 12:-0.333333 13:-1
++1 1:0.125 2:1 3:1 4:-0.283019 5:-0.73516 6:-1 7:1 8:-0.480916 9:1 10:-0.322581 12:-0.333333 13:0.5
+-1 1:-0.0416667 2:1 3:1 4:-0.735849 5:-0.511416 6:1 7:-1 8:0.160305 9:-1 10:-0.967742 11:-1 12:1 13:1
+-1 1:0.375 2:-1 3:1 4:-0.132075 5:0.223744 6:-1 7:1 8:0.312977 9:-1 10:-0.612903 12:-1 13:-1
++1 1:0.708333 2:1 3:0.333333 4:0.245283 5:-0.347032 6:-1 7:-1 8:-0.374046 9:1 10:-0.0645161 12:-0.333333 13:1
+-1 1:0.0416667 2:1 3:1 4:-0.132075 5:-0.484018 6:-1 7:-1 8:0.358779 9:-1 10:-0.612903 11:-1 12:-1 13:-1
++1 1:0.708333 2:1 3:1 4:-0.0377358 5:-0.780822 6:-1 7:-1 8:-0.175573 9:1 10:-0.16129 11:1 12:-1 13:1
+-1 1:0.0416667 2:1 3:-0.333333 4:-0.735849 5:-0.164384 6:-1 7:-1 8:0.29771 9:-1 10:-1 11:-1 12:-1 13:1
++1 1:-0.75 2:1 3:1 4:-0.396226 5:-0.287671 6:-1 7:1 8:0.29771 9:1 10:-1 11:-1 12:-1 13:1
+-1 1:-0.208333 2:1 3:0.333333 4:-0.433962 5:-0.410959 6:1 7:-1 8:0.587786 9:-1 10:-1 11:-1 12:0.333333 13:-1
+-1 1:0.0833333 2:-1 3:-0.333333 4:-0.226415 5:-0.43379 6:-1 7:1 8:0.374046 9:-1 10:-0.548387 12:-1 13:-1
+-1 1:0.208333 2:-1 3:1 4:-0.886792 5:-0.442922 6:-1 7:1 8:-0.221374 9:-1 10:-0.677419 12:-1 13:-1
+-1 1:0.0416667 2:-1 3:0.333333 4:-0.698113 5:-0.598174 6:-1 7:-1 8:0.328244 9:-1 10:-0.483871 12:-1 13:-1
+-1 1:0.666667 2:-1 3:-1 4:-0.132075 5:-0.484018 6:-1 7:-1 8:0.221374 9:-1 10:-0.419355 11:-1 12:0.333333 13:-1
++1 1:1 2:1 3:1 4:-0.415094 5:-0.187215 6:-1 7:1 8:0.389313 9:1 10:-1 11:-1 12:1 13:-1
+-1 1:0.625 2:1 3:0.333333 4:-0.54717 5:-0.310502 6:-1 7:-1 8:0.221374 9:-1 10:-0.677419 11:-1 12:-0.333333 13:1
++1 1:0.208333 2:1 3:1 4:-0.415094 5:-0.205479 6:-1 7:1 8:0.526718 9:-1 10:-1 11:-1 12:0.333333 13:1
++1 1:0.291667 2:1 3:1 4:-0.415094 5:-0.39726 6:-1 7:1 8:0.0687023 9:1 10:-0.0967742 12:-0.333333 13:1
++1 1:-0.0833333 2:1 3:1 4:-0.132075 5:-0.210046 6:-1 7:-1 8:0.557252 9:1 10:-0.483871 11:-1 12:-1 13:1
++1 1:0.0833333 2:1 3:1 4:0.245283 5:-0.255708 6:-1 7:1 8:0.129771 9:1 10:-0.741935 12:-0.333333 13:1
+-1 1:-0.0416667 2:1 3:-1 4:0.0943396 5:-0.214612 6:1 7:-1 8:0.633588 9:-1 10:-0.612903 12:-1 13:1
+-1 1:0.291667 2:-1 3:0.333333 4:-0.849057 5:-0.123288 6:-1 7:-1 8:0.358779 9:-1 10:-1 11:-1 12:-0.333333 13:-1
+-1 1:0.208333 2:1 3:0.333333 4:-0.792453 5:-0.479452 6:-1 7:1 8:0.267176 9:1 10:-0.806452 12:-1 13:1
++1 1:0.458333 2:1 3:0.333333 4:-0.415094 5:-0.164384 6:-1 7:-1 8:-0.0839695 9:1 10:-0.419355 12:-1 13:1
+-1 1:-0.666667 2:1 3:0.333333 4:-0.320755 5:-0.43379 6:-1 7:-1 8:0.770992 9:-1 10:0.129032 11:1 12:-1 13:-1
++1 1:0.25 2:1 3:-1 4:0.433962 5:-0.260274 6:-1 7:1 8:0.343511 9:-1 10:-0.935484 12:-1 13:1
+-1 1:-0.0833333 2:1 3:0.333333 4:-0.415094 5:-0.456621 6:1 7:1 8:0.450382 9:-1 10:-0.225806 12:-1 13:-1
+-1 1:-0.416667 2:-1 3:0.333333 4:-0.471698 5:-0.60274 6:-1 7:-1 8:0.435115 9:-1 10:-0.935484 12:-1 13:-1
++1 1:0.208333 2:1 3:1 4:-0.358491 5:-0.589041 6:-1 7:1 8:-0.0839695 9:1 10:-0.290323 12:1 13:1
+-1 1:-1 2:1 3:-0.333333 4:-0.320755 5:-0.643836 6:-1 7:1 8:1 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:-0.5 2:-1 3:-0.333333 4:-0.320755 5:-0.643836 6:-1 7:1 8:0.541985 9:-1 10:-0.548387 11:-1 12:-1 13:-1
+-1 1:0.416667 2:-1 3:0.333333 4:-0.226415 5:-0.424658 6:-1 7:1 8:0.541985 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:-0.0833333 2:1 3:0.333333 4:-1 5:-0.538813 6:-1 7:-1 8:0.267176 9:1 10:-1 11:-1 12:-0.333333 13:1
+-1 1:0.0416667 2:1 3:0.333333 4:-0.509434 5:-0.39726 6:-1 7:1 8:0.160305 9:-1 10:-0.870968 12:-1 13:1
+-1 1:-0.375 2:1 3:-0.333333 4:-0.509434 5:-0.570776 6:-1 7:-1 8:0.51145 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:0.0416667 2:1 3:1 4:-0.698113 5:-0.484018 6:-1 7:-1 8:-0.160305 9:1 10:-0.0967742 12:-0.333333 13:1
++1 1:0.5 2:1 3:1 4:-0.226415 5:-0.415525 6:-1 7:1 8:-0.145038 9:-1 10:-0.0967742 12:-0.333333 13:1
+-1 1:0.166667 2:1 3:0.333333 4:0.0566038 5:-0.808219 6:-1 7:-1 8:0.572519 9:-1 10:-0.483871 11:-1 12:-1 13:-1
++1 1:0.416667 2:1 3:1 4:-0.320755 5:-0.0684932 6:1 7:1 8:-0.0687023 9:1 10:-0.419355 11:-1 12:1 13:1
+-1 1:-0.75 2:-1 3:1 4:-0.169811 5:-0.739726 6:-1 7:-1 8:0.694656 9:-1 10:-0.548387 11:-1 12:-1 13:-1
+-1 1:-0.5 2:1 3:-0.333333 4:-0.226415 5:-0.648402 6:-1 7:-1 8:-0.0687023 9:-1 10:-1 12:-1 13:0.5
++1 1:0.375 2:-1 3:0.333333 4:-0.320755 5:-0.374429 6:-1 7:-1 8:-0.603053 9:-1 10:-0.612903 12:-0.333333 13:1
++1 1:-0.416667 2:-1 3:1 4:-0.283019 5:-0.0182648 6:1 7:1 8:-0.00763359 9:1 10:-0.0322581 12:-1 13:1
+-1 1:0.208333 2:-1 3:-1 4:0.0566038 5:-0.283105 6:1 7:1 8:0.389313 9:-1 10:-0.677419 11:-1 12:-1 13:-1
+-1 1:-0.0416667 2:1 3:-1 4:-0.54717 5:-0.726027 6:-1 7:1 8:0.816794 9:-1 10:-1 12:-1 13:0.5
++1 1:0.333333 2:-1 3:1 4:-0.0377358 5:-0.173516 6:-1 7:1 8:0.145038 9:1 10:-0.677419 12:-1 13:1
++1 1:-0.583333 2:1 3:1 4:-0.54717 5:-0.575342 6:-1 7:-1 8:0.0534351 9:-1 10:-0.612903 12:-1 13:1
+-1 1:-0.333333 2:1 3:1 4:-0.603774 5:-0.388128 6:-1 7:1 8:0.740458 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:-0.0416667 2:1 3:1 4:-0.358491 5:-0.410959 6:-1 7:-1 8:0.374046 9:1 10:-1 11:-1 12:-0.333333 13:1
+-1 1:0.375 2:1 3:0.333333 4:-0.320755 5:-0.520548 6:-1 7:-1 8:0.145038 9:-1 10:-0.419355 12:1 13:1
++1 1:0.375 2:-1 3:1 4:0.245283 5:-0.826484 6:-1 7:1 8:0.129771 9:-1 10:1 11:1 12:1 13:1
+-1 2:-1 3:1 4:-0.169811 5:-0.506849 6:-1 7:1 8:0.358779 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:-0.416667 2:1 3:1 4:-0.509434 5:-0.767123 6:-1 7:1 8:-0.251908 9:1 10:-0.193548 12:-1 13:1
+-1 1:-0.25 2:1 3:0.333333 4:-0.169811 5:-0.401826 6:-1 7:1 8:0.29771 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:-0.0416667 2:1 3:-0.333333 4:-0.509434 5:-0.0913242 6:-1 7:-1 8:0.541985 9:-1 10:-0.935484 11:-1 12:-1 13:-1
++1 1:0.625 2:1 3:0.333333 4:0.622642 5:-0.324201 6:1 7:1 8:0.206107 9:1 10:-0.483871 12:-1 13:1
+-1 1:-0.583333 2:1 3:0.333333 4:-0.132075 5:-0.109589 6:-1 7:1 8:0.694656 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 2:-1 3:1 4:-0.320755 5:-0.369863 6:-1 7:1 8:0.0992366 9:-1 10:-0.870968 12:-1 13:-1
++1 1:0.375 2:-1 3:1 4:-0.132075 5:-0.351598 6:-1 7:1 8:0.358779 9:-1 10:0.16129 11:1 12:0.333333 13:-1
+-1 1:-0.0833333 2:-1 3:0.333333 4:-0.132075 5:-0.16895 6:-1 7:1 8:0.0839695 9:-1 10:-0.516129 11:-1 12:-0.333333 13:-1
++1 1:0.291667 2:1 3:1 4:-0.320755 5:-0.420091 6:-1 7:-1 8:0.114504 9:1 10:-0.548387 11:-1 12:-0.333333 13:1
++1 1:0.5 2:1 3:1 4:-0.698113 5:-0.442922 6:-1 7:1 8:0.328244 9:-1 10:-0.806452 11:-1 12:0.333333 13:0.5
+-1 1:0.5 2:-1 3:0.333333 4:0.150943 5:-0.347032 6:-1 7:-1 8:0.175573 9:-1 10:-0.741935 11:-1 12:-1 13:-1
++1 1:0.291667 2:1 3:0.333333 4:-0.132075 5:-0.730594 6:-1 7:1 8:0.282443 9:-1 10:-0.0322581 12:-1 13:-1
++1 1:0.291667 2:1 3:1 4:-0.0377358 5:-0.287671 6:-1 7:1 8:0.0839695 9:1 10:-0.0967742 12:0.333333 13:1
++1 1:0.0416667 2:1 3:1 4:-0.509434 5:-0.716895 6:-1 7:-1 8:-0.358779 9:-1 10:-0.548387 12:-0.333333 13:1
+-1 1:-0.375 2:1 3:-0.333333 4:-0.320755 5:-0.575342 6:-1 7:1 8:0.78626 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:-0.375 2:1 3:1 4:-0.660377 5:-0.251142 6:-1 7:1 8:0.251908 9:-1 10:-1 11:-1 12:-0.333333 13:-1
+-1 1:-0.0833333 2:1 3:0.333333 4:-0.698113 5:-0.776256 6:-1 7:-1 8:-0.206107 9:-1 10:-0.806452 11:-1 12:-1 13:-1
+-1 1:0.25 2:1 3:0.333333 4:0.0566038 5:-0.607306 6:1 7:-1 8:0.312977 9:-1 10:-0.483871 11:-1 12:-1 13:-1
+-1 1:0.75 2:-1 3:-0.333333 4:0.245283 5:-0.196347 6:-1 7:-1 8:0.389313 9:-1 10:-0.870968 11:-1 12:0.333333 13:-1
+-1 1:0.333333 2:1 3:0.333333 4:0.0566038 5:-0.465753 6:1 7:-1 8:0.00763359 9:1 10:-0.677419 12:-1 13:-1
++1 1:0.0833333 2:1 3:1 4:-0.283019 5:0.0365297 6:-1 7:-1 8:-0.0687023 9:1 10:-0.612903 12:-0.333333 13:1
++1 1:0.458333 2:1 3:0.333333 4:-0.132075 5:-0.0456621 6:-1 7:-1 8:0.328244 9:-1 10:-1 11:-1 12:-1 13:-1
+-1 1:-0.416667 2:1 3:1 4:0.0566038 5:-0.447489 6:-1 7:-1 8:0.526718 9:-1 10:-0.516129 11:-1 12:-1 13:-1
+-1 1:0.208333 2:-1 3:0.333333 4:-0.509434 5:-0.0228311 6:-1 7:-1 8:0.541985 9:-1 10:-1 11:-1 12:-1 13:-1
++1 1:0.291667 2:1 3:1 4:-0.320755 5:-0.634703 6:-1 7:1 8:-0.0687023 9:1 10:-0.225806 12:0.333333 13:1
++1 1:0.208333 2:1 3:-0.333333 4:-0.509434 5:-0.278539 6:-1 7:1 8:0.358779 9:-1 10:-0.419355 12:-1 13:-1
+-1 1:-0.166667 2:1 3:-0.333333 4:-0.320755 5:-0.360731 6:-1 7:-1 8:0.526718 9:-1 10:-0.806452 11:-1 12:-1 13:-1
++1 1:-0.208333 2:1 3:-0.333333 4:-0.698113 5:-0.52968 6:-1 7:-1 8:0.480916 9:-1 10:-0.677419 11:1 12:-1 13:1
+-1 1:-0.0416667 2:1 3:0.333333 4:0.471698 5:-0.666667 6:1 7:-1 8:0.389313 9:-1 10:-0.83871 11:-1 12:-1 13:1
+-1 1:-0.375 2:1 3:-0.333333 4:-0.509434 5:-0.374429 6:-1 7:-1 8:0.557252 9:-1 10:-1 11:-1 12:-1 13:1
+-1 1:0.125 2:-1 3:-0.333333 4:-0.132075 5:-0.232877 6:-1 7:1 8:0.251908 9:-1 10:-0.580645 12:-1 13:-1
+-1 1:0.166667 2:1 3:1 4:-0.132075 5:-0.69863 6:-1 7:-1 8:0.175573 9:-1 10:-0.870968 12:-1 13:0.5
++1 1:0.583333 2:1 3:1 4:0.245283 5:-0.269406 6:-1 7:1 8:-0.435115 9:1 10:-0.516129 12:1 13:-1
diff --git a/src/backend/app/algorithms/evaluate/libsvm/java/Makefile b/src/backend/app/algorithms/evaluate/libsvm/java/Makefile
new file mode 100644
index 0000000..c84932a
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/java/Makefile
@@ -0,0 +1,26 @@
+.SUFFIXES: .class .java
+FILES = libsvm/svm.class libsvm/svm_model.class libsvm/svm_node.class \
+ libsvm/svm_parameter.class libsvm/svm_problem.class \
+ libsvm/svm_print_interface.class \
+ svm_train.class svm_predict.class svm_toy.class svm_scale.class
+
+#JAVAC = jikes
+JAVAC_FLAGS = -target 1.7 -source 1.7
+JAVAC = javac
+# JAVAC_FLAGS =
+export CLASSPATH := .:$(CLASSPATH)
+
+all: $(FILES)
+ jar cvf libsvm.jar *.class libsvm/*.class
+
+.java.class:
+ $(JAVAC) $(JAVAC_FLAGS) $<
+
+libsvm/svm.java: libsvm/svm.m4
+ m4 libsvm/svm.m4 > libsvm/svm.java
+
+clean:
+ rm -f libsvm/*.class *.class *.jar libsvm/*~ *~ libsvm/svm.java
+
+dist: clean all
+ rm *.class libsvm/*.class
diff --git a/src/backend/app/algorithms/evaluate/libsvm/java/libsvm.jar b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm.jar
new file mode 100644
index 0000000..1a9a124
Binary files /dev/null and b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm.jar differ
diff --git a/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm.java b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm.java
new file mode 100644
index 0000000..c027900
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm.java
@@ -0,0 +1,2860 @@
+
+
+
+
+
+package libsvm;
+import java.io.*;
+import java.util.*;
+
+//
+// Kernel Cache
+//
+// l is the number of total data items
+// size is the cache size limit in bytes
+//
+class Cache {
+ private final int l;
+ private long size;
+ private final class head_t
+ {
+	head_t prev, next;	// a circular list
+ float[] data;
+ int len; // data[0,len) is cached in this entry
+ }
+ private final head_t[] head;
+ private head_t lru_head;
+
+ Cache(int l_, long size_)
+ {
+ l = l_;
+ size = size_;
+ head = new head_t[l];
+ for(int i=0;i<l;i++) head[i] = new head_t();
+ size /= 4;
+ size -= l * (16/4); // sizeof(head_t) == 16
+ size = Math.max(size, 2* (long) l); // cache must be large enough for two columns
+ lru_head = new head_t();
+ lru_head.next = lru_head.prev = lru_head;
+ }
+
+ private void lru_delete(head_t h)
+ {
+ // delete from current location
+ h.prev.next = h.next;
+ h.next.prev = h.prev;
+ }
+
+ private void lru_insert(head_t h)
+ {
+ // insert to last position
+ h.next = lru_head;
+ h.prev = lru_head.prev;
+ h.prev.next = h;
+ h.next.prev = h;
+ }
+
+ // request data [0,len)
+ // return some position p where [p,len) need to be filled
+ // (p >= len if nothing needs to be filled)
+ // java: simulate pointer using single-element array
+ int get_data(int index, float[][] data, int len)
+ {
+ head_t h = head[index];
+ if(h.len > 0) lru_delete(h);
+ int more = len - h.len;
+
+ if(more > 0)
+ {
+ // free old space
+ while(size < more)
+ {
+ head_t old = lru_head.next;
+ lru_delete(old);
+ size += old.len;
+ old.data = null;
+ old.len = 0;
+ }
+
+ // allocate new space
+ float[] new_data = new float[len];
+ if(h.data != null) System.arraycopy(h.data,0,new_data,0,h.len);
+ h.data = new_data;
+ size -= more;
+ do {int tmp=h.len; h.len=len; len=tmp;} while(false);
+ }
+
+ lru_insert(h);
+ data[0] = h.data;
+ return len;
+ }
+
+ void swap_index(int i, int j)
+ {
+ if(i==j) return;
+
+ if(head[i].len > 0) lru_delete(head[i]);
+ if(head[j].len > 0) lru_delete(head[j]);
+ do {float[] tmp=head[i].data; head[i].data=head[j].data; head[j].data=tmp;} while(false);
+ do {int tmp=head[i].len; head[i].len=head[j].len; head[j].len=tmp;} while(false);
+ if(head[i].len > 0) lru_insert(head[i]);
+ if(head[j].len > 0) lru_insert(head[j]);
+
+ if(i>j) do {int tmp=i; i=j; j=tmp;} while(false);
+ for(head_t h = lru_head.next; h!=lru_head; h=h.next)
+ {
+ if(h.len > i)
+ {
+ if(h.len > j)
+ do {float tmp=h.data[i]; h.data[i]=h.data[j]; h.data[j]=tmp;} while(false);
+ else
+ {
+ // give up
+ lru_delete(h);
+ size += h.len;
+ h.data = null;
+ h.len = 0;
+ }
+ }
+ }
+ }
+}
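+// note: Cache keeps recently used kernel-matrix columns in a circular LRU list;
+// get_data returns the first position of the requested column that the caller
+// still has to compute (or len if the whole column was already cached).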
+
+//
+// Kernel evaluation
+//
+// the static method k_function is for doing single kernel evaluation
+// the constructor of Kernel prepares to calculate the l*l kernel matrix
+// the member function get_Q is for getting one column from the Q Matrix
+//
+abstract class QMatrix {
+ abstract float[] get_Q(int column, int len);
+ abstract double[] get_QD();
+ abstract void swap_index(int i, int j);
+};
+
+abstract class Kernel extends QMatrix {
+ private svm_node[][] x;
+ private final double[] x_square;
+
+ // svm_parameter
+ private final int kernel_type;
+ private final int degree;
+ private final double gamma;
+ private final double coef0;
+
+ abstract float[] get_Q(int column, int len);
+ abstract double[] get_QD();
+
+ void swap_index(int i, int j)
+ {
+ do {svm_node[] tmp=x[i]; x[i]=x[j]; x[j]=tmp;} while(false);
+ if(x_square != null) do {double tmp=x_square[i]; x_square[i]=x_square[j]; x_square[j]=tmp;} while(false);
+ }
+
+ private static double powi(double base, int times)
+ {
+ double tmp = base, ret = 1.0;
+
+ for(int t=times; t>0; t/=2)
+ {
+ if(t%2==1) ret*=tmp;
+ tmp = tmp * tmp;
+ }
+ return ret;
+ }
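+ // e.g., powi(3.0,4): tmp goes 3,9,81 and ret picks up 81 when t==1, so the result is 81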
+
+ double kernel_function(int i, int j)
+ {
+ switch(kernel_type)
+ {
+ case svm_parameter.LINEAR:
+ return dot(x[i],x[j]);
+ case svm_parameter.POLY:
+ return powi(gamma*dot(x[i],x[j])+coef0,degree);
+ case svm_parameter.RBF:
+ return Math.exp(-gamma*(x_square[i]+x_square[j]-2*dot(x[i],x[j])));
+ case svm_parameter.SIGMOID:
+ return Math.tanh(gamma*dot(x[i],x[j])+coef0);
+ case svm_parameter.PRECOMPUTED:
+ return x[i][(int)(x[j][0].value)].value;
+ default:
+ return 0; // java
+ }
+ }
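+ // note: the RBF case expands ||x_i-x_j||^2 as x_i.x_i + x_j.x_j - 2*x_i.x_j,
+ // reusing the x_square values precomputed in the constructor below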
+
+ Kernel(int l, svm_node[][] x_, svm_parameter param)
+ {
+ this.kernel_type = param.kernel_type;
+ this.degree = param.degree;
+ this.gamma = param.gamma;
+ this.coef0 = param.coef0;
+
+ x = (svm_node[][])x_.clone();
+
+ if(kernel_type == svm_parameter.RBF)
+ {
+ x_square = new double[l];
+ for(int i=0;i<l;i++)
+ x_square[i] = dot(x[i],x[i]);
+ }
+ else x_square = null;
+ }
+
+ static double dot(svm_node[] x, svm_node[] y)
+ {
+ double sum = 0;
+ int xlen = x.length;
+ int ylen = y.length;
+ int i = 0;
+ int j = 0;
+ while(i < xlen && j < ylen)
+ {
+ if(x[i].index == y[j].index)
+ sum += x[i++].value * y[j++].value;
+ else
+ {
+ if(x[i].index > y[j].index)
+ ++j;
+ else
+ ++i;
+ }
+ }
+ return sum;
+ }
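+ // note: dot() merges the two index-sorted sparse arrays in one pass;
+ // indices present in only one vector contribute nothing to the product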
+
+ static double k_function(svm_node[] x, svm_node[] y,
+ svm_parameter param)
+ {
+ switch(param.kernel_type)
+ {
+ case svm_parameter.LINEAR:
+ return dot(x,y);
+ case svm_parameter.POLY:
+ return powi(param.gamma*dot(x,y)+param.coef0,param.degree);
+ case svm_parameter.RBF:
+ {
+ double sum = 0;
+ int xlen = x.length;
+ int ylen = y.length;
+ int i = 0;
+ int j = 0;
+ while(i < xlen && j < ylen)
+ {
+ if(x[i].index == y[j].index)
+ {
+ double d = x[i++].value - y[j++].value;
+ sum += d*d;
+ }
+ else if(x[i].index > y[j].index)
+ {
+ sum += y[j].value * y[j].value;
+ ++j;
+ }
+ else
+ {
+ sum += x[i].value * x[i].value;
+ ++i;
+ }
+ }
+
+ while(i < xlen)
+ {
+ sum += x[i].value * x[i].value;
+ ++i;
+ }
+
+ while(j < ylen)
+ {
+ sum += y[j].value * y[j].value;
+ ++j;
+ }
+
+ return Math.exp(-param.gamma*sum);
+ }
+ case svm_parameter.SIGMOID:
+ return Math.tanh(param.gamma*dot(x,y)+param.coef0);
+ case svm_parameter.PRECOMPUTED:
+ return x[(int)(y[0].value)].value;
+ default:
+ return 0; // java
+ }
+ }
+}
+
+// An SMO algorithm in Fan et al., JMLR 6(2005), p. 1889--1918
+// Solves:
+//
+// min 0.5(\alpha^T Q \alpha) + p^T \alpha
+//
+// y^T \alpha = \delta
+// y_i = +1 or -1
+// 0 <= alpha_i <= Cp for y_i = 1
+// 0 <= alpha_i <= Cn for y_i = -1
+//
+// Given:
+//
+// Q, p, y, Cp, Cn, and an initial feasible point \alpha
+// l is the size of vectors and matrices
+// eps is the stopping tolerance
+//
+// solution will be put in \alpha, objective value will be put in obj
+//
+class Solver {
+ int active_size;
+ byte[] y;
+ double[] G; // gradient of objective function
+ static final byte LOWER_BOUND = 0;
+ static final byte UPPER_BOUND = 1;
+ static final byte FREE = 2;
+ byte[] alpha_status; // LOWER_BOUND, UPPER_BOUND, FREE
+ double[] alpha;
+ QMatrix Q;
+ double[] QD;
+ double eps;
+ double Cp,Cn;
+ double[] p;
+ int[] active_set;
+ double[] G_bar; // gradient, if we treat free variables as 0
+ int l;
+ boolean unshrink; // XXX
+
+ static final double INF = java.lang.Double.POSITIVE_INFINITY;
+
+ double get_C(int i)
+ {
+ return (y[i] > 0)? Cp : Cn;
+ }
+ void update_alpha_status(int i)
+ {
+ if(alpha[i] >= get_C(i))
+ alpha_status[i] = UPPER_BOUND;
+ else if(alpha[i] <= 0)
+ alpha_status[i] = LOWER_BOUND;
+ else alpha_status[i] = FREE;
+ }
+ boolean is_upper_bound(int i) { return alpha_status[i] == UPPER_BOUND; }
+ boolean is_lower_bound(int i) { return alpha_status[i] == LOWER_BOUND; }
+ boolean is_free(int i) { return alpha_status[i] == FREE; }
+
+ // java: information about solution except alpha,
+ // because we cannot return multiple values otherwise...
+ static class SolutionInfo {
+ double obj;
+ double rho;
+ double upper_bound_p;
+ double upper_bound_n;
+ double r; // for Solver_NU
+ }
+
+ void swap_index(int i, int j)
+ {
+ Q.swap_index(i,j);
+ do {byte tmp=y[i]; y[i]=y[j]; y[j]=tmp;} while(false);
+ do {double tmp=G[i]; G[i]=G[j]; G[j]=tmp;} while(false);
+ do {byte tmp=alpha_status[i]; alpha_status[i]=alpha_status[j]; alpha_status[j]=tmp;} while(false);
+ do {double tmp=alpha[i]; alpha[i]=alpha[j]; alpha[j]=tmp;} while(false);
+ do {double tmp=p[i]; p[i]=p[j]; p[j]=tmp;} while(false);
+ do {int tmp=active_set[i]; active_set[i]=active_set[j]; active_set[j]=tmp;} while(false);
+ do {double tmp=G_bar[i]; G_bar[i]=G_bar[j]; G_bar[j]=tmp;} while(false);
+ }
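+ // note: the do {...} while(false) blocks here and above are the expanded form of
+ // the swap() macro defined in libsvm/svm.m4, from which this file is generated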
+
+ void reconstruct_gradient()
+ {
+ // reconstruct inactive elements of G from G_bar and free variables
+
+ if(active_size == l) return;
+
+ int i,j;
+ int nr_free = 0;
+
+ for(j=active_size;j<l;j++)
+ G[j] = G_bar[j] + p[j];
+
+ for(j=0;j<active_size;j++)
+ if(is_free(j))
+ nr_free++;
+
+ if(2*nr_free < active_size)
+ svm.info("\nWARNING: using -h 0 may be faster\n");
+
+ if (nr_free*l > 2*active_size*(l-active_size))
+ {
+ for(i=active_size;i<l;i++)
+ {
+ float[] Q_i = Q.get_Q(i,active_size);
+ for(j=0;j<active_size;j++)
+ if(is_free(j))
+ G[i] += alpha[j] * Q_i[j];
+ }
+ }
+
+ int max_iter = Math.max(10000000, l>Integer.MAX_VALUE/100 ? Integer.MAX_VALUE : 100*l);
+ int counter = Math.min(l,1000)+1;
+ int[] working_set = new int[2];
+
+ while(iter < max_iter)
+ {
+ // show progress and do shrinking
+
+ if(--counter == 0)
+ {
+ counter = Math.min(l,1000);
+ if(shrinking!=0) do_shrinking();
+ svm.info(".");
+ }
+
+ if(select_working_set(working_set)!=0)
+ {
+ // reconstruct the whole gradient
+ reconstruct_gradient();
+ // reset active set size and check
+ active_size = l;
+ svm.info("*");
+ if(select_working_set(working_set)!=0)
+ break;
+ else
+ counter = 1; // do shrinking next iteration
+ }
+
+ int i = working_set[0];
+ int j = working_set[1];
+
+ ++iter;
+
+ // update alpha[i] and alpha[j], handle bounds carefully
+
+ float[] Q_i = Q.get_Q(i,active_size);
+ float[] Q_j = Q.get_Q(j,active_size);
+
+ double C_i = get_C(i);
+ double C_j = get_C(j);
+
+ double old_alpha_i = alpha[i];
+ double old_alpha_j = alpha[j];
+
+ if(y[i]!=y[j])
+ {
+ double quad_coef = QD[i]+QD[j]+2*Q_i[j];
+ if (quad_coef <= 0)
+ quad_coef = 1e-12;
+ double delta = (-G[i]-G[j])/quad_coef;
+ double diff = alpha[i] - alpha[j];
+ alpha[i] += delta;
+ alpha[j] += delta;
+
+ if(diff > 0)
+ {
+ if(alpha[j] < 0)
+ {
+ alpha[j] = 0;
+ alpha[i] = diff;
+ }
+ }
+ else
+ {
+ if(alpha[i] < 0)
+ {
+ alpha[i] = 0;
+ alpha[j] = -diff;
+ }
+ }
+ if(diff > C_i - C_j)
+ {
+ if(alpha[i] > C_i)
+ {
+ alpha[i] = C_i;
+ alpha[j] = C_i - diff;
+ }
+ }
+ else
+ {
+ if(alpha[j] > C_j)
+ {
+ alpha[j] = C_j;
+ alpha[i] = C_j + diff;
+ }
+ }
+ }
+ else
+ {
+ double quad_coef = QD[i]+QD[j]-2*Q_i[j];
+ if (quad_coef <= 0)
+ quad_coef = 1e-12;
+ double delta = (G[i]-G[j])/quad_coef;
+ double sum = alpha[i] + alpha[j];
+ alpha[i] -= delta;
+ alpha[j] += delta;
+
+ if(sum > C_i)
+ {
+ if(alpha[i] > C_i)
+ {
+ alpha[i] = C_i;
+ alpha[j] = sum - C_i;
+ }
+ }
+ else
+ {
+ if(alpha[j] < 0)
+ {
+ alpha[j] = 0;
+ alpha[i] = sum;
+ }
+ }
+ if(sum > C_j)
+ {
+ if(alpha[j] > C_j)
+ {
+ alpha[j] = C_j;
+ alpha[i] = sum - C_j;
+ }
+ }
+ else
+ {
+ if(alpha[i] < 0)
+ {
+ alpha[i] = 0;
+ alpha[j] = sum;
+ }
+ }
+ }
+
+ // update G
+
+ double delta_alpha_i = alpha[i] - old_alpha_i;
+ double delta_alpha_j = alpha[j] - old_alpha_j;
+
+ for(int k=0;k<active_size;k++)
+ G[k] += Q_i[k]*delta_alpha_i + Q_j[k]*delta_alpha_j;
+ }
+
+ if(iter >= max_iter)
+ {
+ if(active_size < l)
+ {
+ // reconstruct the whole gradient to calculate objective value
+ reconstruct_gradient();
+ active_size = l;
+ svm.info("*");
+ }
+ System.err.print("\nWARNING: reaching max number of iterations\n");
+ }
+
+ // calculate rho
+
+ si.rho = calculate_rho();
+
+ // calculate objective value
+ {
+ double v = 0;
+ int i;
+ for(i=0;i<l;i++)
+ v += alpha[i] * (G[i] + p[i]);
+
+ si.obj = v/2;
+ }
+
+ // put back the solution
+ {
+ for(int i=0;i<l;i++)
+ alpha_[active_set[i]] = alpha[i];
+ }
+
+ si.upper_bound_p = Cp;
+ si.upper_bound_n = Cn;
+
+ svm.info("\noptimization finished, #iter = "+iter+"\n");
+ }
+
+ // return 1 if already optimal, return 0 otherwise
+ int select_working_set(int[] working_set)
+ {
+ double Gmax = -INF;
+ double Gmax2 = -INF;
+ int Gmax_idx = -1;
+ int Gmin_idx = -1;
+ double obj_diff_min = INF;
+
+ for(int t=0;t<active_size;t++)
+ if(y[t]==+1)
+ {
+ if(!is_upper_bound(t))
+ if(-G[t] >= Gmax)
+ {
+ Gmax = -G[t];
+ Gmax_idx = t;
+ }
+ }
+ else
+ {
+ if(!is_lower_bound(t))
+ if(G[t] >= Gmax)
+ {
+ Gmax = G[t];
+ Gmax_idx = t;
+ }
+ }
+
+ int i = Gmax_idx;
+ float[] Q_i = null;
+ if(i != -1) // null Q_i not accessed: Gmax=-INF if i=-1
+ Q_i = Q.get_Q(i,active_size);
+
+ for(int j=0;j<active_size;j++)
+ {
+ if(y[j]==+1)
+ {
+ if (!is_lower_bound(j))
+ {
+ double grad_diff=Gmax+G[j];
+ if (G[j] >= Gmax2)
+ Gmax2 = G[j];
+ if (grad_diff > 0)
+ {
+ double obj_diff;
+ double quad_coef = QD[i]+QD[j]-2.0*y[i]*Q_i[j];
+ if (quad_coef > 0)
+ obj_diff = -(grad_diff*grad_diff)/quad_coef;
+ else
+ obj_diff = -(grad_diff*grad_diff)/1e-12;
+
+ if (obj_diff <= obj_diff_min)
+ {
+ Gmin_idx=j;
+ obj_diff_min = obj_diff;
+ }
+ }
+ }
+ }
+ else
+ {
+ if (!is_upper_bound(j))
+ {
+ double grad_diff= Gmax-G[j];
+ if (-G[j] >= Gmax2)
+ Gmax2 = -G[j];
+ if (grad_diff > 0)
+ {
+ double obj_diff;
+ double quad_coef = QD[i]+QD[j]+2.0*y[i]*Q_i[j];
+ if (quad_coef > 0)
+ obj_diff = -(grad_diff*grad_diff)/quad_coef;
+ else
+ obj_diff = -(grad_diff*grad_diff)/1e-12;
+
+ if (obj_diff <= obj_diff_min)
+ {
+ Gmin_idx=j;
+ obj_diff_min = obj_diff;
+ }
+ }
+ }
+ }
+ }
+
+ if(Gmax+Gmax2 < eps || Gmin_idx == -1)
+ return 1;
+
+ working_set[0] = Gmax_idx;
+ working_set[1] = Gmin_idx;
+ return 0;
+ }
+
+ private boolean be_shrunk(int i, double Gmax1, double Gmax2)
+ {
+ if(is_upper_bound(i))
+ {
+ if(y[i]==+1)
+ return(-G[i] > Gmax1);
+ else
+ return(-G[i] > Gmax2);
+ }
+ else if(is_lower_bound(i))
+ {
+ if(y[i]==+1)
+ return(G[i] > Gmax2);
+ else
+ return(G[i] > Gmax1);
+ }
+ else
+ return(false);
+ }
+
+ void do_shrinking()
+ {
+ int i;
+ double Gmax1 = -INF; // max { -y_i * grad(f)_i | i in I_up(\alpha) }
+ double Gmax2 = -INF; // max { y_i * grad(f)_i | i in I_low(\alpha) }
+
+ // find maximal violating pair first
+ for(i=0;i<active_size;i++)
+ {
+ if(y[i]==+1)
+ {
+ if(!is_upper_bound(i))
+ {
+ if(-G[i] >= Gmax1)
+ Gmax1 = -G[i];
+ }
+ if(!is_lower_bound(i))
+ {
+ if(G[i] >= Gmax2)
+ Gmax2 = G[i];
+ }
+ }
+ else
+ {
+ if(!is_upper_bound(i))
+ {
+ if(-G[i] >= Gmax2)
+ Gmax2 = -G[i];
+ }
+ if(!is_lower_bound(i))
+ {
+ if(G[i] >= Gmax1)
+ Gmax1 = G[i];
+ }
+ }
+ }
+
+ if(unshrink == false && Gmax1 + Gmax2 <= eps*10)
+ {
+ unshrink = true;
+ reconstruct_gradient();
+ active_size = l;
+ }
+
+ for(i=0;i<active_size;i++)
+ {
+ if (be_shrunk(i, Gmax1, Gmax2))
+ {
+ active_size--;
+ while (active_size > i)
+ {
+ if (!be_shrunk(active_size, Gmax1, Gmax2))
+ {
+ swap_index(i,active_size);
+ break;
+ }
+ active_size--;
+ }
+ }
+ }
+
+ double calculate_rho()
+ {
+ double r;
+ int nr_free = 0;
+ double ub = INF, lb = -INF, sum_free = 0;
+ for(int i=0;i<active_size;i++)
+ {
+ double yG = y[i]*G[i];
+
+ if(is_lower_bound(i))
+ {
+ if(y[i] > 0)
+ ub = Math.min(ub,yG);
+ else
+ lb = Math.max(lb,yG);
+ }
+ else if(is_upper_bound(i))
+ {
+ if(y[i] < 0)
+ ub = Math.min(ub,yG);
+ else
+ lb = Math.max(lb,yG);
+ }
+ else
+ {
+ ++nr_free;
+ sum_free += yG;
+ }
+ }
+
+ if(nr_free>0)
+ r = sum_free/nr_free;
+ else
+ r = (ub+lb)/2;
+
+ return r;
+ }
+
+}
+
+//
+// Solver for nu-svm classification and regression
+//
+// additional constraint: e^T \alpha = constant
+//
+final class Solver_NU extends Solver
+{
+ private SolutionInfo si;
+
+ void Solve(int l, QMatrix Q, double[] p, byte[] y,
+ double[] alpha, double Cp, double Cn, double eps,
+ SolutionInfo si, int shrinking)
+ {
+ this.si = si;
+ super.Solve(l,Q,p,y,alpha,Cp,Cn,eps,si,shrinking);
+ }
+
+ // return 1 if already optimal, return 0 otherwise
+ int select_working_set(int[] working_set)
+ {
+ // return i,j such that y_i = y_j and
+ // i: maximizes -y_i * grad(f)_i, i in I_up(\alpha)
+ // j: minimizes the decrease of obj value
+ // (if quadratic coefficient <= 0, replace it with tau)
+ // -y_j*grad(f)_j < -y_i*grad(f)_i, j in I_low(\alpha)
+
+ double Gmaxp = -INF;
+ double Gmaxp2 = -INF;
+ int Gmaxp_idx = -1;
+
+ double Gmaxn = -INF;
+ double Gmaxn2 = -INF;
+ int Gmaxn_idx = -1;
+
+ int Gmin_idx = -1;
+ double obj_diff_min = INF;
+
+ for(int t=0;t<active_size;t++)
+ if(y[t]==+1)
+ {
+ if(!is_upper_bound(t))
+ if(-G[t] >= Gmaxp)
+ {
+ Gmaxp = -G[t];
+ Gmaxp_idx = t;
+ }
+ }
+ else
+ {
+ if(!is_lower_bound(t))
+ if(G[t] >= Gmaxn)
+ {
+ Gmaxn = G[t];
+ Gmaxn_idx = t;
+ }
+ }
+
+ int ip = Gmaxp_idx;
+ int in = Gmaxn_idx;
+ float[] Q_ip = null;
+ float[] Q_in = null;
+ if(ip != -1) // null Q_ip not accessed: Gmaxp=-INF if ip=-1
+ Q_ip = Q.get_Q(ip,active_size);
+ if(in != -1)
+ Q_in = Q.get_Q(in,active_size);
+
+ for(int j=0;j<active_size;j++)
+ {
+ if(y[j]==+1)
+ {
+ if (!is_lower_bound(j))
+ {
+ double grad_diff=Gmaxp+G[j];
+ if (G[j] >= Gmaxp2)
+ Gmaxp2 = G[j];
+ if (grad_diff > 0)
+ {
+ double obj_diff;
+ double quad_coef = QD[ip]+QD[j]-2*Q_ip[j];
+ if (quad_coef > 0)
+ obj_diff = -(grad_diff*grad_diff)/quad_coef;
+ else
+ obj_diff = -(grad_diff*grad_diff)/1e-12;
+
+ if (obj_diff <= obj_diff_min)
+ {
+ Gmin_idx=j;
+ obj_diff_min = obj_diff;
+ }
+ }
+ }
+ }
+ else
+ {
+ if (!is_upper_bound(j))
+ {
+ double grad_diff=Gmaxn-G[j];
+ if (-G[j] >= Gmaxn2)
+ Gmaxn2 = -G[j];
+ if (grad_diff > 0)
+ {
+ double obj_diff;
+ double quad_coef = QD[in]+QD[j]-2*Q_in[j];
+ if (quad_coef > 0)
+ obj_diff = -(grad_diff*grad_diff)/quad_coef;
+ else
+ obj_diff = -(grad_diff*grad_diff)/1e-12;
+
+ if (obj_diff <= obj_diff_min)
+ {
+ Gmin_idx=j;
+ obj_diff_min = obj_diff;
+ }
+ }
+ }
+ }
+ }
+
+ if(Math.max(Gmaxp+Gmaxp2,Gmaxn+Gmaxn2) < eps || Gmin_idx == -1)
+ return 1;
+
+ if(y[Gmin_idx] == +1)
+ working_set[0] = Gmaxp_idx;
+ else
+ working_set[0] = Gmaxn_idx;
+ working_set[1] = Gmin_idx;
+
+ return 0;
+ }
+
+ private boolean be_shrunk(int i, double Gmax1, double Gmax2, double Gmax3, double Gmax4)
+ {
+ if(is_upper_bound(i))
+ {
+ if(y[i]==+1)
+ return(-G[i] > Gmax1);
+ else
+ return(-G[i] > Gmax4);
+ }
+ else if(is_lower_bound(i))
+ {
+ if(y[i]==+1)
+ return(G[i] > Gmax2);
+ else
+ return(G[i] > Gmax3);
+ }
+ else
+ return(false);
+ }
+
+ void do_shrinking()
+ {
+ double Gmax1 = -INF; // max { -y_i * grad(f)_i | y_i = +1, i in I_up(\alpha) }
+ double Gmax2 = -INF; // max { y_i * grad(f)_i | y_i = +1, i in I_low(\alpha) }
+ double Gmax3 = -INF; // max { -y_i * grad(f)_i | y_i = -1, i in I_up(\alpha) }
+ double Gmax4 = -INF; // max { y_i * grad(f)_i | y_i = -1, i in I_low(\alpha) }
+
+ // find maximal violating pair first
+ int i;
+ for(i=0;i<active_size;i++)
+ {
+ if(!is_upper_bound(i))
+ {
+ if(y[i]==+1)
+ {
+ if(-G[i] > Gmax1) Gmax1 = -G[i];
+ }
+ else if(-G[i] > Gmax4) Gmax4 = -G[i];
+ }
+ if(!is_lower_bound(i))
+ {
+ if(y[i]==+1)
+ {
+ if(G[i] > Gmax2) Gmax2 = G[i];
+ }
+ else if(G[i] > Gmax3) Gmax3 = G[i];
+ }
+ }
+
+ if(unshrink == false && Math.max(Gmax1+Gmax2,Gmax3+Gmax4) <= eps*10)
+ {
+ unshrink = true;
+ reconstruct_gradient();
+ active_size = l;
+ }
+
+ for(i=0;i<active_size;i++)
+ {
+ if (be_shrunk(i, Gmax1, Gmax2, Gmax3, Gmax4))
+ {
+ active_size--;
+ while (active_size > i)
+ {
+ if (!be_shrunk(active_size, Gmax1, Gmax2, Gmax3, Gmax4))
+ {
+ swap_index(i,active_size);
+ break;
+ }
+ active_size--;
+ }
+ }
+ }
+
+ double calculate_rho()
+ {
+ int nr_free1 = 0,nr_free2 = 0;
+ double ub1 = INF, ub2 = INF;
+ double lb1 = -INF, lb2 = -INF;
+ double sum_free1 = 0, sum_free2 = 0;
+
+ for(int i=0;i<active_size;i++)
+ {
+ if(y[i]==+1)
+ {
+ if(is_upper_bound(i))
+ lb1 = Math.max(lb1,G[i]);
+ else if(is_lower_bound(i))
+ ub1 = Math.min(ub1,G[i]);
+ else
+ {
+ ++nr_free1;
+ sum_free1 += G[i];
+ }
+ }
+ else
+ {
+ if(is_upper_bound(i))
+ lb2 = Math.max(lb2,G[i]);
+ else if(is_lower_bound(i))
+ ub2 = Math.min(ub2,G[i]);
+ else
+ {
+ ++nr_free2;
+ sum_free2 += G[i];
+ }
+ }
+ }
+
+ double r1,r2;
+ if(nr_free1 > 0)
+ r1 = sum_free1/nr_free1;
+ else
+ r1 = (ub1+lb1)/2;
+
+ if(nr_free2 > 0)
+ r2 = sum_free2/nr_free2;
+ else
+ r2 = (ub2+lb2)/2;
+
+ si.r = (r1+r2)/2;
+ return (r1-r2)/2;
+ }
+}
+
+//
+// Q matrices for various formulations
+//
+class SVC_Q extends Kernel
+{
+ private final byte[] y;
+ private final Cache cache;
+ private final double[] QD;
+
+ SVC_Q(svm_problem prob, svm_parameter param, byte[] y_)
+ {
+ super(prob.l, prob.x, param);
+ y = (byte[])y_.clone();
+ cache = new Cache(prob.l,(long)(param.cache_size*(1<<20)));
+ QD = new double[prob.l];
+ for(int i=0;i<prob.l;i++)
+ QD[i] = kernel_function(i,i);
+ }
+
+ for(i=0;i<l;i++)
+ {
+ alpha[i] = 0;
+ minus_ones[i] = -1;
+ if(prob.y[i] > 0) y[i] = +1; else y[i] = -1;
+ }
+ }
+
+ Solver s = new Solver();
+ s.Solve(l, new SVC_Q(prob,param,y), minus_ones, y,
+ alpha, Cp, Cn, param.eps, si, param.shrinking);
+
+ double sum_alpha=0;
+ for(i=0;i<l;i++)
+ sum_alpha += alpha[i];
+
+ if (Cp==Cn)
+ svm.info("nu = "+sum_alpha/(Cp*prob.l)+"\n");
+
+ for(i=0;i<l;i++)
+ alpha[i] *= y[i];
+ }
+
+ private static void solve_nu_svc(svm_problem prob, svm_parameter param,
+ double[] alpha, Solver.SolutionInfo si)
+ {
+ int i;
+ int l = prob.l;
+ double nu = param.nu;
+
+ byte[] y = new byte[l];
+
+ for(i=0;i<l;i++)
+ if(prob.y[i]>0)
+ y[i] = +1;
+ else
+ y[i] = -1;
+
+ double sum_pos = nu*l/2;
+ double sum_neg = nu*l/2;
+
+ for(i=0;i<l;i++)
+ {
+ if(y[i] == +1)
+ {
+ alpha[i] = Math.min(1.0,sum_pos);
+ sum_pos -= alpha[i];
+ }
+ else
+ {
+ alpha[i] = Math.min(1.0,sum_neg);
+ sum_neg -= alpha[i];
+ }
+ }
+
+ int nSV = 0;
+ int nBSV = 0;
+ for(i=0;i<prob.l;i++)
+ {
+ if(Math.abs(alpha[i]) > 0)
+ {
+ ++nSV;
+ if(prob.y[i] > 0)
+ {
+ if(Math.abs(alpha[i]) >= si.upper_bound_p)
+ ++nBSV;
+ }
+ else
+ {
+ if(Math.abs(alpha[i]) >= si.upper_bound_n)
+ ++nBSV;
+ }
+ }
+ }
+
+ svm.info("nSV = "+nSV+", nBSV = "+nBSV+"\n");
+
+ decision_function f = new decision_function();
+ f.alpha = alpha;
+ f.rho = si.rho;
+ return f;
+ }
+
+ // Platt's binary SVM Probabilistic Output: an improvement from Lin et al.
+ private static void sigmoid_train(int l, double[] dec_values, double[] labels,
+ double[] probAB)
+ {
+ double A, B;
+ double prior1=0, prior0 = 0;
+ int i;
+
+ for (i=0;i<l;i++)
+ if (labels[i] > 0) prior1+=1;
+ else prior0+=1;
+
+ int max_iter=100; // Maximal number of iterations
+ double min_step=1e-10; // Minimal step taken in line search
+ double sigma=1e-12; // For numerically strict PD of Hessian
+ double eps=1e-5;
+ double hiTarget=(prior1+1.0)/(prior1+2.0);
+ double loTarget=1/(prior0+2.0);
+ double[] t= new double[l];
+ double fApB,p,q,h11,h22,h21,g1,g2,det,dA,dB,gd,stepsize;
+ double newA,newB,newf,d1,d2;
+ int iter;
+
+ // Initial Point and Initial Fun Value
+ A=0.0; B=Math.log((prior0+1.0)/(prior1+1.0));
+ double fval = 0.0;
+
+ for (i=0;i<l;i++)
+ {
+ if (labels[i]>0) t[i]=hiTarget;
+ else t[i]=loTarget;
+ fApB = dec_values[i]*A+B;
+ if (fApB>=0)
+ fval += t[i]*fApB + Math.log(1+Math.exp(-fApB));
+ else
+ fval += (t[i] - 1)*fApB +Math.log(1+Math.exp(fApB));
+ }
+ for (iter=0;iter<max_iter;iter++)
+ {
+ // Update Gradient and Hessian (use H' = H + sigma I)
+ h11=sigma; // numerically ensures strict PD
+ h22=sigma;
+ h21=0.0;g1=0.0;g2=0.0;
+ for (i=0;i<l;i++)
+ {
+ fApB = dec_values[i]*A+B;
+ if (fApB >= 0)
+ {
+ p=Math.exp(-fApB)/(1.0+Math.exp(-fApB));
+ q=1.0/(1.0+Math.exp(-fApB));
+ }
+ else
+ {
+ p=1.0/(1.0+Math.exp(fApB));
+ q=Math.exp(fApB)/(1.0+Math.exp(fApB));
+ }
+ d2=p*q;
+ h11+=dec_values[i]*dec_values[i]*d2;
+ h22+=d2;
+ h21+=dec_values[i]*d2;
+ d1=t[i]-p;
+ g1+=dec_values[i]*d1;
+ g2+=d1;
+ }
+
+ // Stopping Criteria
+ if (Math.abs(g1)<eps && Math.abs(g2)<eps)
+ break;
+
+ // Finding Newton direction: -inv(H') * g
+ det=h11*h22-h21*h21;
+ dA=-(h22*g1 - h21 * g2) / det;
+ dB=-(-h21*g1+ h11 * g2) / det;
+ gd=g1*dA+g2*dB;
+
+ stepsize = 1; // Line search
+ while (stepsize >= min_step)
+ {
+ newA = A + stepsize * dA;
+ newB = B + stepsize * dB;
+
+ // New function value
+ newf = 0.0;
+ for (i=0;i<l;i++)
+ {
+ fApB = dec_values[i]*newA+newB;
+ if (fApB >= 0)
+ newf += t[i]*fApB + Math.log(1+Math.exp(-fApB));
+ else
+ newf += (t[i] - 1)*fApB +Math.log(1+Math.exp(fApB));
+ }
+ // Check sufficient decrease
+ if (newf<fval+0.0001*stepsize*gd)
+ {
+ A=newA; B=newB; fval=newf;
+ break; // Sufficient decrease satisfied
+ }
+ else
+ stepsize = stepsize / 2.0;
+ }
+
+ if (stepsize < min_step)
+ {
+ svm.info("Line search fails in two-class probability estimates\n");
+ break;
+ }
+ }
+
+ if (iter>=max_iter)
+ svm.info("Reaching maximal iterations in two-class probability estimates\n");
+ probAB[0]=A;probAB[1]=B;
+ }
+
+ private static double sigmoid_predict(double decision_value, double A, double B)
+ {
+ double fApB = decision_value*A+B;
+ if (fApB >= 0)
+ return Math.exp(-fApB)/(1.0+Math.exp(-fApB));
+ else
+ return 1.0/(1+Math.exp(fApB)) ;
+ }
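+ // note: both branches equal 1/(1+exp(fApB)); the split form only avoids
+ // overflow in Math.exp when |fApB| is large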
+
+ // Method 2 from the multiclass_prob paper by Wu, Lin, and Weng
+ private static void multiclass_probability(int k, double[][] r, double[] p)
+ {
+ int t,j;
+ int iter = 0, max_iter=Math.max(100,k);
+ double[][] Q=new double[k][k];
+ double[] Qp=new double[k];
+ double pQp, eps=0.005/k;
+
+ for (t=0;t<k;t++)
+ {
+ p[t]=1.0/k; // Valid if k = 1
+ Q[t][t]=0;
+ for (j=0;j<t;j++)
+ {
+ Q[t][t]+=r[j][t]*r[j][t];
+ Q[t][j]=Q[j][t];
+ }
+ for (j=t+1;j<k;j++)
+ {
+ Q[t][t]+=r[j][t]*r[j][t];
+ Q[t][j]=-r[j][t]*r[t][j];
+ }
+ }
+ for (iter=0;iter<max_iter;iter++)
+ {
+ // stopping condition, recalculate QP,pQP for numerical accuracy
+ pQp=0;
+ for (t=0;t<k;t++)
+ {
+ Qp[t]=0;
+ for (j=0;j<k;j++)
+ Qp[t]+=Q[t][j]*p[j];
+ pQp+=p[t]*Qp[t];
+ }
+ double max_error=0;
+ for (t=0;t<k;t++)
+ {
+ double error=Math.abs(Qp[t]-pQp);
+ if (error>max_error)
+ max_error=error;
+ }
+ if (max_error<eps) break;
+
+ for (t=0;t<k;t++)
+ {
+ double diff=(-Qp[t]+pQp)/Q[t][t];
+ p[t]+=diff;
+ pQp=(pQp+diff*(diff*Q[t][t]+2*Qp[t]))/(1+diff)/(1+diff);
+ for (j=0;j<k;j++)
+ {
+ Qp[j]=(Qp[j]+diff*Q[t][j])/(1+diff);
+ p[j]/=(1+diff);
+ }
+ }
+ }
+ if (iter>=max_iter)
+ svm.info("Exceeds max_iter in multiclass_prob\n");
+ }
+
+ // Cross-validation decision values for probability estimates
+ private static void svm_binary_svc_probability(svm_problem prob, svm_parameter param, double Cp, double Cn, double[] probAB)
+ {
+ int i;
+ int nr_fold = 5;
+ int[] perm = new int[prob.l];
+ double[] dec_values = new double[prob.l];
+
+ // random shuffle
+ for(i=0;i<prob.l;i++) perm[i]=i;
+ for(i=0;i<prob.l;i++)
+ {
+ int j = i+rand.nextInt(prob.l-i);
+ do {int tmp=perm[i]; perm[i]=perm[j]; perm[j]=tmp;} while(false);
+ }
+ for(i=0;i<nr_fold;i++)
+ {
+ int begin = i*prob.l/nr_fold;
+ int end = (i+1)*prob.l/nr_fold;
+ int j,k;
+ svm_problem subprob = new svm_problem();
+
+ subprob.l = prob.l-(end-begin);
+ subprob.x = new svm_node[subprob.l][];
+ subprob.y = new double[subprob.l];
+
+ k=0;
+ for(j=0;j<begin;j++)
+ {
+ subprob.x[k] = prob.x[perm[j]];
+ subprob.y[k] = prob.y[perm[j]];
+ ++k;
+ }
+ for(j=end;j<prob.l;j++)
+ {
+ subprob.x[k] = prob.x[perm[j]];
+ subprob.y[k] = prob.y[perm[j]];
+ ++k;
+ }
+ int p_count=0,n_count=0;
+ for(j=begin;j<end;j++)
+ if(prob.y[perm[j]]>0)
+ p_count++;
+ else
+ n_count++;
+
+ if(p_count==0 && n_count==0)
+ for(j=begin;j<end;j++)
+ dec_values[perm[j]] = 0;
+ else if(p_count > 0 && n_count == 0)
+ for(j=begin;j<end;j++)
+ dec_values[perm[j]] = 1;
+ else if(p_count == 0 && n_count > 0)
+ for(j=begin;j<end;j++)
+ dec_values[perm[j]] = -1;
+
+ for(i=0;i<prob.l;i++)
+ if (Math.abs(ymv[i]) > 5*std)
+ count=count+1;
+ else
+ mae+=Math.abs(ymv[i]);
+ mae /= (prob.l-count);
+ svm.info("Prob. model for test data: target value = predicted value + z,\nz: Laplace distribution e^(-|z|/sigma)/(2sigma),sigma="+mae+"\n");
+ return mae;
+ }
+
+ // label: label name, start: begin of each class, count: #data of classes, perm: indices to the original data
+ // perm, length l, must be allocated before calling this subroutine
+ private static void svm_group_classes(svm_problem prob, int[] nr_class_ret, int[][] label_ret, int[][] start_ret, int[][] count_ret, int[] perm)
+ {
+ int l = prob.l;
+ int max_nr_class = 16;
+ int nr_class = 0;
+ int[] label = new int[max_nr_class];
+ int[] count = new int[max_nr_class];
+ int[] data_label = new int[l];
+ int i;
+
+ for(i=0;i<l;i++)
+ {
+ int this_label = (int)(prob.y[i]);
+ int j;
+ for(j=0;j<nr_class;j++)
+ if(this_label == label[j])
+ {
+ ++count[j];
+ break;
+ }
+ data_label[i] = j;
+ }
+
+ int nSV = 0;
+ for(i=0;i<prob.l;i++)
+ if(Math.abs(f.alpha[i]) > 0) ++nSV;
+ model.l = nSV;
+ model.SV = new svm_node[nSV][];
+ model.sv_coef[0] = new double[nSV];
+ model.sv_indices = new int[nSV];
+ int j = 0;
+ for(i=0;i<prob.l;i++)
+ if(Math.abs(f.alpha[i]) > 0)
+ {
+ model.SV[j] = prob.x[i];
+ model.sv_coef[0][j] = f.alpha[i];
+ model.sv_indices[j] = i+1;
+ ++j;
+ }
+ }
+ else
+ {
+ // classification
+ int l = prob.l;
+ int[] tmp_nr_class = new int[1];
+ int[][] tmp_label = new int[1][];
+ int[][] tmp_start = new int[1][];
+ int[][] tmp_count = new int[1][];
+ int[] perm = new int[l];
+
+ // group training data of the same class
+ svm_group_classes(prob,tmp_nr_class,tmp_label,tmp_start,tmp_count,perm);
+ int nr_class = tmp_nr_class[0];
+ int[] label = tmp_label[0];
+ int[] start = tmp_start[0];
+ int[] count = tmp_count[0];
+
+ if(nr_class == 1)
+ svm.info("WARNING: training data in only one class. See README for details.\n");
+
+ svm_node[][] x = new svm_node[l][];
+ int i;
+ for(i=0;i<l;i++)
+ x[i] = prob.x[perm[i]];
+
+ for(k=0;k<ci;k++)
+ if(!nonzero[si+k] && Math.abs(f[p].alpha[k]) > 0)
+ nonzero[si+k] = true;
+ for(k=0;k<cj;k++)
+ if(!nonzero[sj+k] && Math.abs(f[p].alpha[ci+k]) > 0)
+ nonzero[sj+k] = true;
+ ++p;
+ }
+
+ // build output
+
+ model.nr_class = nr_class;
+
+ model.label = new int[nr_class];
+ for(i=0;i<nr_class;i++)
+ model.label[i] = label[i];
+
+ // stratified cv may not give leave-one-out rate
+ // Each class to l folds -> some folds may have zero elements
+ if((param.svm_type == svm_parameter.C_SVC ||
+ param.svm_type == svm_parameter.NU_SVC) && nr_fold < l)
+ {
+ int[] tmp_nr_class = new int[1];
+ int[][] tmp_label = new int[1][];
+ int[][] tmp_start = new int[1][];
+ int[][] tmp_count = new int[1][];
+
+ svm_group_classes(prob,tmp_nr_class,tmp_label,tmp_start,tmp_count,perm);
+
+ int nr_class = tmp_nr_class[0];
+ int[] start = tmp_start[0];
+ int[] count = tmp_count[0];
+
+ // random shuffle and then data grouped by fold using the array perm
+ int[] fold_count = new int[nr_fold];
+ int c;
+ int[] index = new int[l];
+ for(i=0;i<l;i++)
+ index[i]=perm[i];
+
+ if(model.param.svm_type == svm_parameter.ONE_CLASS)
+ return (sum>0)?1:-1;
+ else
+ return sum;
+ }
+ else
+ {
+ int nr_class = model.nr_class;
+ int l = model.l;
+
+ double[] kvalue = new double[l];
+ for(i=0;i<l;i++)
+ kvalue[i] = Kernel.k_function(x,model.SV[i],model.param);
+
+ int[] start = new int[nr_class];
+ start[0] = 0;
+ for(i=1;i<nr_class;i++)
+ start[i] = start[i-1]+model.nSV[i-1];
+
+ int[] vote = new int[nr_class];
+ for(i=0;i<nr_class;i++)
+ vote[i] = 0;
+
+ int p=0;
+ for(i=0;i<nr_class;i++)
+ for(int j=i+1;j<nr_class;j++)
+ {
+ double sum = 0;
+ int si = start[i];
+ int sj = start[j];
+ int ci = model.nSV[i];
+ int cj = model.nSV[j];
+
+ int k;
+ double[] coef1 = model.sv_coef[j-1];
+ double[] coef2 = model.sv_coef[i];
+ for(k=0;k<ci;k++)
+ sum += coef1[si+k] * kvalue[si+k];
+ for(k=0;k<cj;k++)
+ sum += coef2[sj+k] * kvalue[sj+k];
+ sum -= model.rho[p];
+ dec_values[p] = sum;
+
+ if(dec_values[p] > 0)
+ ++vote[i];
+ else
+ ++vote[j];
+ p++;
+ }
+
+ int vote_max_idx = 0;
+ for(i=1;i<nr_class;i++)
+ if(vote[i] > vote[vote_max_idx])
+ vote_max_idx = i;
+
+ return model.label[vote_max_idx];
+ }
+ }
+
+ public static double svm_predict(svm_model model, svm_node[] x)
+ {
+ int nr_class = model.nr_class;
+ double[] dec_values;
+ if(model.param.svm_type == svm_parameter.ONE_CLASS ||
+ model.param.svm_type == svm_parameter.EPSILON_SVR ||
+ model.param.svm_type == svm_parameter.NU_SVR)
+ dec_values = new double[1];
+ else
+ dec_values = new double[nr_class*(nr_class-1)/2];
+ double pred_result = svm_predict_values(model, x, dec_values);
+ return pred_result;
+ }
+
+ public static double svm_predict_probability(svm_model model, svm_node[] x, double[] prob_estimates)
+ {
+ if ((model.param.svm_type == svm_parameter.C_SVC || model.param.svm_type == svm_parameter.NU_SVC) &&
+ model.probA!=null && model.probB!=null)
+ {
+ int i;
+ int nr_class = model.nr_class;
+ double[] dec_values = new double[nr_class*(nr_class-1)/2];
+ svm_predict_values(model, x, dec_values);
+
+ double min_prob=1e-7;
+ double[][] pairwise_prob=new double[nr_class][nr_class];
+
+ int k=0;
+ for(i=0;i<nr_class;i++)
+ for(int j=i+1;j<nr_class;j++)
+ {
+ pairwise_prob[i][j]=Math.min(Math.max(sigmoid_predict(dec_values[k],model.probA[k],model.probB[k]),min_prob),1-min_prob);
+ pairwise_prob[j][i]=1-pairwise_prob[i][j];
+ k++;
+ }
+ multiclass_probability(nr_class,pairwise_prob,prob_estimates);
+
+ int prob_max_idx = 0;
+ for(i=1;i<nr_class;i++)
+ if(prob_estimates[i] > prob_estimates[prob_max_idx])
+ prob_max_idx = i;
+ return model.label[prob_max_idx];
+ }
+ else
+ return svm_predict(model, x);
+ }
+
+ static final String svm_type_table[] =
+ {
+ "c_svc","nu_svc","one_class","epsilon_svr","nu_svr",
+ };
+
+ static final String kernel_type_table[]=
+ {
+ "linear","polynomial","rbf","sigmoid","precomputed"
+ };
+
+ public static void svm_save_model(String model_file_name, svm_model model) throws IOException
+ {
+ DataOutputStream fp = new DataOutputStream(new BufferedOutputStream(new FileOutputStream(model_file_name)));
+
+ svm_parameter param = model.param;
+
+ fp.writeBytes("svm_type "+svm_type_table[param.svm_type]+"\n");
+ fp.writeBytes("kernel_type "+kernel_type_table[param.kernel_type]+"\n");
+
+ if(param.kernel_type == svm_parameter.POLY)
+ fp.writeBytes("degree "+param.degree+"\n");
+
+ if(param.kernel_type == svm_parameter.POLY ||
+ param.kernel_type == svm_parameter.RBF ||
+ param.kernel_type == svm_parameter.SIGMOID)
+ fp.writeBytes("gamma "+param.gamma+"\n");
+
+ if(param.kernel_type == svm_parameter.POLY ||
+ param.kernel_type == svm_parameter.SIGMOID)
+ fp.writeBytes("coef0 "+param.coef0+"\n");
+
+ int nr_class = model.nr_class;
+ int l = model.l;
+ fp.writeBytes("nr_class "+nr_class+"\n");
+ fp.writeBytes("total_sv "+l+"\n");
+
+ {
+ fp.writeBytes("rho");
+ for(int i=0;i<nr_class*(nr_class-1)/2;i++)
+ fp.writeBytes(" "+model.rho[i]);
+ fp.writeBytes("\n");
+
+ if(svm_type == svm_parameter.NU_SVC ||
+ svm_type == svm_parameter.ONE_CLASS ||
+ svm_type == svm_parameter.NU_SVR)
+ if(param.nu <= 0 || param.nu > 1)
+ return "nu <= 0 or nu > 1";
+
+ if(svm_type == svm_parameter.EPSILON_SVR)
+ if(param.p < 0)
+ return "p < 0";
+
+ if(param.shrinking != 0 &&
+ param.shrinking != 1)
+ return "shrinking != 0 and shrinking != 1";
+
+ if(param.probability != 0 &&
+ param.probability != 1)
+ return "probability != 0 and probability != 1";
+
+ if(param.probability == 1 &&
+ svm_type == svm_parameter.ONE_CLASS)
+ return "one-class SVM probability output not supported yet";
+
+ // check whether nu-svc is feasible
+
+ if(svm_type == svm_parameter.NU_SVC)
+ {
+ int l = prob.l;
+ int max_nr_class = 16;
+ int nr_class = 0;
+ int[] label = new int[max_nr_class];
+ int[] count = new int[max_nr_class];
+
+ int i;
+ for(i=0;i<nr_class;i++)
+ {
+ int n1 = count[i];
+ for(int j=i+1;j<nr_class;j++)
+ {
+ int n2 = count[j];
+ if(param.nu*(n1+n2)/2 > Math.min(n1,n2))
+ return "specified nu is infeasible";
+ }
+ }
+ }
+
+ return null;
+ }
+
+ public static int svm_check_probability_model(svm_model model)
+ {
+ if (((model.param.svm_type == svm_parameter.C_SVC || model.param.svm_type == svm_parameter.NU_SVC) &&
+ model.probA!=null && model.probB!=null) ||
+ ((model.param.svm_type == svm_parameter.EPSILON_SVR || model.param.svm_type == svm_parameter.NU_SVR) &&
+ model.probA!=null))
+ return 1;
+ else
+ return 0;
+ }
+
+ public static void svm_set_print_string_function(svm_print_interface print_func)
+ {
+ if (print_func == null)
+ svm_print_string = svm_print_stdout;
+ else
+ svm_print_string = print_func;
+ }
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm.m4 b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm.m4
new file mode 100644
index 0000000..ca2621d
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm.m4
@@ -0,0 +1,2860 @@
+define(`swap',`do {$1 tmp=$2; $2=$3; $3=tmp;} while(false)')
+define(`Qfloat',`float')
+define(`SIZE_OF_QFLOAT',4)
+define(`TAU',1e-12)
+changecom(`//',`')
+package libsvm;
+import java.io.*;
+import java.util.*;
+
+//
+// Kernel Cache
+//
+// l is the number of total data items
+// size is the cache size limit in bytes
+//
+class Cache {
+ private final int l;
+ private long size;
+ private final class head_t
+ {
+ head_t prev, next; // a circular list
+ Qfloat[] data;
+ int len; // data[0,len) is cached in this entry
+ }
+ private final head_t[] head;
+ private head_t lru_head;
+
+ Cache(int l_, long size_)
+ {
+ l = l_;
+ size = size_;
+ head = new head_t[l];
+ for(int i=0;i<l;i++) head[i] = new head_t();
+ size /= SIZE_OF_QFLOAT;
+ size -= l * (16/SIZE_OF_QFLOAT); // sizeof(head_t) == 16
+ size = Math.max(size, 2* (long) l); // cache must be large enough for two columns
+ lru_head = new head_t();
+ lru_head.next = lru_head.prev = lru_head;
+ }
+
+ private void lru_delete(head_t h)
+ {
+ // delete from current location
+ h.prev.next = h.next;
+ h.next.prev = h.prev;
+ }
+
+ private void lru_insert(head_t h)
+ {
+ // insert to last position
+ h.next = lru_head;
+ h.prev = lru_head.prev;
+ h.prev.next = h;
+ h.next.prev = h;
+ }
+
+ // request data [0,len)
+ // return some position p where [p,len) need to be filled
+ // (p >= len if nothing needs to be filled)
+ // java: simulate pointer using single-element array
+ int get_data(int index, Qfloat[][] data, int len)
+ {
+ head_t h = head[index];
+ if(h.len > 0) lru_delete(h);
+ int more = len - h.len;
+
+ if(more > 0)
+ {
+ // free old space
+ while(size < more)
+ {
+ head_t old = lru_head.next;
+ lru_delete(old);
+ size += old.len;
+ old.data = null;
+ old.len = 0;
+ }
+
+ // allocate new space
+ Qfloat[] new_data = new Qfloat[len];
+ if(h.data != null) System.arraycopy(h.data,0,new_data,0,h.len);
+ h.data = new_data;
+ size -= more;
+ swap(int,h.len,len);
+ }
+
+ lru_insert(h);
+ data[0] = h.data;
+ return len;
+ }
+
+ void swap_index(int i, int j)
+ {
+ if(i==j) return;
+
+ if(head[i].len > 0) lru_delete(head[i]);
+ if(head[j].len > 0) lru_delete(head[j]);
+ swap(Qfloat[],head[i].data,head[j].data);
+ swap(int,head[i].len,head[j].len);
+ if(head[i].len > 0) lru_insert(head[i]);
+ if(head[j].len > 0) lru_insert(head[j]);
+
+ if(i>j) swap(int,i,j);
+ for(head_t h = lru_head.next; h!=lru_head; h=h.next)
+ {
+ if(h.len > i)
+ {
+ if(h.len > j)
+ swap(Qfloat,h.data[i],h.data[j]);
+ else
+ {
+ // give up
+ lru_delete(h);
+ size += h.len;
+ h.data = null;
+ h.len = 0;
+ }
+ }
+ }
+ }
+}
+
+//
+// Kernel evaluation
+//
+// the static method k_function is for doing single kernel evaluation
+// the constructor of Kernel prepares to calculate the l*l kernel matrix
+// the member function get_Q is for getting one column from the Q Matrix
+//
+abstract class QMatrix {
+ abstract Qfloat[] get_Q(int column, int len);
+ abstract double[] get_QD();
+ abstract void swap_index(int i, int j);
+};
+
+abstract class Kernel extends QMatrix {
+ private svm_node[][] x;
+ private final double[] x_square;
+
+ // svm_parameter
+ private final int kernel_type;
+ private final int degree;
+ private final double gamma;
+ private final double coef0;
+
+ abstract Qfloat[] get_Q(int column, int len);
+ abstract double[] get_QD();
+
+ void swap_index(int i, int j)
+ {
+ swap(svm_node[],x[i],x[j]);
+ if(x_square != null) swap(double,x_square[i],x_square[j]);
+ }
+
+ private static double powi(double base, int times)
+ {
+ double tmp = base, ret = 1.0;
+
+ for(int t=times; t>0; t/=2)
+ {
+ if(t%2==1) ret*=tmp;
+ tmp = tmp * tmp;
+ }
+ return ret;
+ }
+
+ double kernel_function(int i, int j)
+ {
+ switch(kernel_type)
+ {
+ case svm_parameter.LINEAR:
+ return dot(x[i],x[j]);
+ case svm_parameter.POLY:
+ return powi(gamma*dot(x[i],x[j])+coef0,degree);
+ case svm_parameter.RBF:
+ return Math.exp(-gamma*(x_square[i]+x_square[j]-2*dot(x[i],x[j])));
+ case svm_parameter.SIGMOID:
+ return Math.tanh(gamma*dot(x[i],x[j])+coef0);
+ case svm_parameter.PRECOMPUTED:
+ return x[i][(int)(x[j][0].value)].value;
+ default:
+ return 0; // java
+ }
+ }
+
+ Kernel(int l, svm_node[][] x_, svm_parameter param)
+ {
+ this.kernel_type = param.kernel_type;
+ this.degree = param.degree;
+ this.gamma = param.gamma;
+ this.coef0 = param.coef0;
+
+ x = (svm_node[][])x_.clone();
+
+ if(kernel_type == svm_parameter.RBF)
+ {
+ x_square = new double[l];
+ for(int i=0;i<l;i++)
+ x_square[i] = dot(x[i],x[i]);
+ }
+ else x_square = null;
+ }
+
+ static double dot(svm_node[] x, svm_node[] y)
+ {
+ double sum = 0;
+ int xlen = x.length;
+ int ylen = y.length;
+ int i = 0;
+ int j = 0;
+ while(i < xlen && j < ylen)
+ {
+ if(x[i].index == y[j].index)
+ sum += x[i++].value * y[j++].value;
+ else
+ {
+ if(x[i].index > y[j].index)
+ ++j;
+ else
+ ++i;
+ }
+ }
+ return sum;
+ }
+
+ static double k_function(svm_node[] x, svm_node[] y,
+ svm_parameter param)
+ {
+ switch(param.kernel_type)
+ {
+ case svm_parameter.LINEAR:
+ return dot(x,y);
+ case svm_parameter.POLY:
+ return powi(param.gamma*dot(x,y)+param.coef0,param.degree);
+ case svm_parameter.RBF:
+ {
+ double sum = 0;
+ int xlen = x.length;
+ int ylen = y.length;
+ int i = 0;
+ int j = 0;
+ while(i < xlen && j < ylen)
+ {
+ if(x[i].index == y[j].index)
+ {
+ double d = x[i++].value - y[j++].value;
+ sum += d*d;
+ }
+ else if(x[i].index > y[j].index)
+ {
+ sum += y[j].value * y[j].value;
+ ++j;
+ }
+ else
+ {
+ sum += x[i].value * x[i].value;
+ ++i;
+ }
+ }
+
+ while(i < xlen)
+ {
+ sum += x[i].value * x[i].value;
+ ++i;
+ }
+
+ while(j < ylen)
+ {
+ sum += y[j].value * y[j].value;
+ ++j;
+ }
+
+ return Math.exp(-param.gamma*sum);
+ }
+ case svm_parameter.SIGMOID:
+ return Math.tanh(param.gamma*dot(x,y)+param.coef0);
+ case svm_parameter.PRECOMPUTED:
+ return x[(int)(y[0].value)].value;
+ default:
+ return 0; // java
+ }
+ }
+}
+
+// An SMO algorithm in Fan et al., JMLR 6(2005), p. 1889--1918
+// Solves:
+//
+// min 0.5(\alpha^T Q \alpha) + p^T \alpha
+//
+// y^T \alpha = \delta
+// y_i = +1 or -1
+// 0 <= alpha_i <= Cp for y_i = 1
+// 0 <= alpha_i <= Cn for y_i = -1
+//
+// Given:
+//
+// Q, p, y, Cp, Cn, and an initial feasible point \alpha
+// l is the size of vectors and matrices
+// eps is the stopping tolerance
+//
+// solution will be put in \alpha, objective value will be put in obj
+//
+class Solver {
+ int active_size;
+ byte[] y;
+ double[] G; // gradient of objective function
+ static final byte LOWER_BOUND = 0;
+ static final byte UPPER_BOUND = 1;
+ static final byte FREE = 2;
+ byte[] alpha_status; // LOWER_BOUND, UPPER_BOUND, FREE
+ double[] alpha;
+ QMatrix Q;
+ double[] QD;
+ double eps;
+ double Cp,Cn;
+ double[] p;
+ int[] active_set;
+ double[] G_bar; // gradient, if we treat free variables as 0
+ int l;
+ boolean unshrink; // XXX
+
+ static final double INF = java.lang.Double.POSITIVE_INFINITY;
+
+ double get_C(int i)
+ {
+ return (y[i] > 0)? Cp : Cn;
+ }
+ void update_alpha_status(int i)
+ {
+ if(alpha[i] >= get_C(i))
+ alpha_status[i] = UPPER_BOUND;
+ else if(alpha[i] <= 0)
+ alpha_status[i] = LOWER_BOUND;
+ else alpha_status[i] = FREE;
+ }
+ boolean is_upper_bound(int i) { return alpha_status[i] == UPPER_BOUND; }
+ boolean is_lower_bound(int i) { return alpha_status[i] == LOWER_BOUND; }
+ boolean is_free(int i) { return alpha_status[i] == FREE; }
+
+ // java: information about solution except alpha,
+ // because we cannot return multiple values otherwise...
+ static class SolutionInfo {
+ double obj;
+ double rho;
+ double upper_bound_p;
+ double upper_bound_n;
+ double r; // for Solver_NU
+ }
+
+ void swap_index(int i, int j)
+ {
+ Q.swap_index(i,j);
+ swap(byte, y[i],y[j]);
+ swap(double, G[i],G[j]);
+ swap(byte, alpha_status[i],alpha_status[j]);
+ swap(double, alpha[i],alpha[j]);
+ swap(double, p[i],p[j]);
+ swap(int, active_set[i],active_set[j]);
+ swap(double, G_bar[i],G_bar[j]);
+ }
+
+ void reconstruct_gradient()
+ {
+ // reconstruct inactive elements of G from G_bar and free variables
+
+ if(active_size == l) return;
+
+ int i,j;
+ int nr_free = 0;
+
+ for(j=active_size;j<l;j++)
+ G[j] = G_bar[j] + p[j];
+
+ for(j=0;j<active_size;j++)
+ if(is_free(j))
+ nr_free++;
+
+ if(2*nr_free < active_size)
+ svm.info("\nWARNING: using -h 0 may be faster\n");
+
+ if (nr_free*l > 2*active_size*(l-active_size))
+ {
+ for(i=active_size;i<l;i++)
+ {
+ Qfloat[] Q_i = Q.get_Q(i,active_size);
+ for(j=0;j<active_size;j++)
+ if(is_free(j))
+ G[i] += alpha[j] * Q_i[j];
+ }
+ }
+
+ int max_iter = Math.max(10000000, l>Integer.MAX_VALUE/100 ? Integer.MAX_VALUE : 100*l);
+ int counter = Math.min(l,1000)+1;
+ int[] working_set = new int[2];
+
+ while(iter < max_iter)
+ {
+ // show progress and do shrinking
+
+ if(--counter == 0)
+ {
+ counter = Math.min(l,1000);
+ if(shrinking!=0) do_shrinking();
+ svm.info(".");
+ }
+
+ if(select_working_set(working_set)!=0)
+ {
+ // reconstruct the whole gradient
+ reconstruct_gradient();
+ // reset active set size and check
+ active_size = l;
+ svm.info("*");
+ if(select_working_set(working_set)!=0)
+ break;
+ else
+ counter = 1; // do shrinking next iteration
+ }
+
+ int i = working_set[0];
+ int j = working_set[1];
+
+ ++iter;
+
+ // update alpha[i] and alpha[j], handle bounds carefully
+
+ Qfloat[] Q_i = Q.get_Q(i,active_size);
+ Qfloat[] Q_j = Q.get_Q(j,active_size);
+
+ double C_i = get_C(i);
+ double C_j = get_C(j);
+
+ double old_alpha_i = alpha[i];
+ double old_alpha_j = alpha[j];
+
+ if(y[i]!=y[j])
+ {
+ double quad_coef = QD[i]+QD[j]+2*Q_i[j];
+ if (quad_coef <= 0)
+ quad_coef = TAU;
+ double delta = (-G[i]-G[j])/quad_coef;
+ double diff = alpha[i] - alpha[j];
+ alpha[i] += delta;
+ alpha[j] += delta;
+
+ if(diff > 0)
+ {
+ if(alpha[j] < 0)
+ {
+ alpha[j] = 0;
+ alpha[i] = diff;
+ }
+ }
+ else
+ {
+ if(alpha[i] < 0)
+ {
+ alpha[i] = 0;
+ alpha[j] = -diff;
+ }
+ }
+ if(diff > C_i - C_j)
+ {
+ if(alpha[i] > C_i)
+ {
+ alpha[i] = C_i;
+ alpha[j] = C_i - diff;
+ }
+ }
+ else
+ {
+ if(alpha[j] > C_j)
+ {
+ alpha[j] = C_j;
+ alpha[i] = C_j + diff;
+ }
+ }
+ }
+ else
+ {
+ double quad_coef = QD[i]+QD[j]-2*Q_i[j];
+ if (quad_coef <= 0)
+ quad_coef = TAU;
+ double delta = (G[i]-G[j])/quad_coef;
+ double sum = alpha[i] + alpha[j];
+ alpha[i] -= delta;
+ alpha[j] += delta;
+
+ if(sum > C_i)
+ {
+ if(alpha[i] > C_i)
+ {
+ alpha[i] = C_i;
+ alpha[j] = sum - C_i;
+ }
+ }
+ else
+ {
+ if(alpha[j] < 0)
+ {
+ alpha[j] = 0;
+ alpha[i] = sum;
+ }
+ }
+ if(sum > C_j)
+ {
+ if(alpha[j] > C_j)
+ {
+ alpha[j] = C_j;
+ alpha[i] = sum - C_j;
+ }
+ }
+ else
+ {
+ if(alpha[i] < 0)
+ {
+ alpha[i] = 0;
+ alpha[j] = sum;
+ }
+ }
+ }
+
+ // update G
+
+ double delta_alpha_i = alpha[i] - old_alpha_i;
+ double delta_alpha_j = alpha[j] - old_alpha_j;
+
+ for(int k=0;k<active_size;k++)
+ G[k] += Q_i[k]*delta_alpha_i + Q_j[k]*delta_alpha_j;
+ }
+
+ if(iter >= max_iter)
+ {
+ if(active_size < l)
+ {
+ // reconstruct the whole gradient to calculate objective value
+ reconstruct_gradient();
+ active_size = l;
+ svm.info("*");
+ }
+ System.err.print("\nWARNING: reaching max number of iterations\n");
+ }
+
+ // calculate rho
+
+ si.rho = calculate_rho();
+
+ // calculate objective value
+ {
+ double v = 0;
+ int i;
+ for(i=0;i<l;i++)
+ v += alpha[i] * (G[i] + p[i]);
+
+ si.obj = v/2;
+ }
+
+ // put back the solution
+ {
+ for(int i=0;i<l;i++)
+ alpha_[active_set[i]] = alpha[i];
+ }
+
+ si.upper_bound_p = Cp;
+ si.upper_bound_n = Cn;
+
+ svm.info("\noptimization finished, #iter = "+iter+"\n");
+ }
+
+ // return 1 if already optimal, return 0 otherwise
+ int select_working_set(int[] working_set)
+ {
+ double Gmax = -INF;
+ double Gmax2 = -INF;
+ int Gmax_idx = -1;
+ int Gmin_idx = -1;
+ double obj_diff_min = INF;
+
+ for(int t=0;t<active_size;t++)
+ if(y[t]==+1)
+ {
+ if(!is_upper_bound(t))
+ if(-G[t] >= Gmax)
+ {
+ Gmax = -G[t];
+ Gmax_idx = t;
+ }
+ }
+ else
+ {
+ if(!is_lower_bound(t))
+ if(G[t] >= Gmax)
+ {
+ Gmax = G[t];
+ Gmax_idx = t;
+ }
+ }
+
+ int i = Gmax_idx;
+ Qfloat[] Q_i = null;
+ if(i != -1) // null Q_i not accessed: Gmax=-INF if i=-1
+ Q_i = Q.get_Q(i,active_size);
+
+ for(int j=0;j<active_size;j++)
+ {
+ if(y[j]==+1)
+ {
+ if (!is_lower_bound(j))
+ {
+ double grad_diff=Gmax+G[j];
+ if (G[j] >= Gmax2)
+ Gmax2 = G[j];
+ if (grad_diff > 0)
+ {
+ double obj_diff;
+ double quad_coef = QD[i]+QD[j]-2.0*y[i]*Q_i[j];
+ if (quad_coef > 0)
+ obj_diff = -(grad_diff*grad_diff)/quad_coef;
+ else
+ obj_diff = -(grad_diff*grad_diff)/TAU;
+
+ if (obj_diff <= obj_diff_min)
+ {
+ Gmin_idx=j;
+ obj_diff_min = obj_diff;
+ }
+ }
+ }
+ }
+ else
+ {
+ if (!is_upper_bound(j))
+ {
+ double grad_diff= Gmax-G[j];
+ if (-G[j] >= Gmax2)
+ Gmax2 = -G[j];
+ if (grad_diff > 0)
+ {
+ double obj_diff;
+ double quad_coef = QD[i]+QD[j]+2.0*y[i]*Q_i[j];
+ if (quad_coef > 0)
+ obj_diff = -(grad_diff*grad_diff)/quad_coef;
+ else
+ obj_diff = -(grad_diff*grad_diff)/TAU;
+
+ if (obj_diff <= obj_diff_min)
+ {
+ Gmin_idx=j;
+ obj_diff_min = obj_diff;
+ }
+ }
+ }
+ }
+ }
+
+ if(Gmax+Gmax2 < eps || Gmin_idx == -1)
+ return 1;
+
+ working_set[0] = Gmax_idx;
+ working_set[1] = Gmin_idx;
+ return 0;
+ }
+
+ private boolean be_shrunk(int i, double Gmax1, double Gmax2)
+ {
+ if(is_upper_bound(i))
+ {
+ if(y[i]==+1)
+ return(-G[i] > Gmax1);
+ else
+ return(-G[i] > Gmax2);
+ }
+ else if(is_lower_bound(i))
+ {
+ if(y[i]==+1)
+ return(G[i] > Gmax2);
+ else
+ return(G[i] > Gmax1);
+ }
+ else
+ return(false);
+ }
+
+ void do_shrinking()
+ {
+ int i;
+ double Gmax1 = -INF; // max { -y_i * grad(f)_i | i in I_up(\alpha) }
+ double Gmax2 = -INF; // max { y_i * grad(f)_i | i in I_low(\alpha) }
+
+ // find maximal violating pair first
+ for(i=0;i<active_size;i++)
+ {
+ if(y[i]==+1)
+ {
+ if(!is_upper_bound(i))
+ {
+ if(-G[i] >= Gmax1)
+ Gmax1 = -G[i];
+ }
+ if(!is_lower_bound(i))
+ {
+ if(G[i] >= Gmax2)
+ Gmax2 = G[i];
+ }
+ }
+ else
+ {
+ if(!is_upper_bound(i))
+ {
+ if(-G[i] >= Gmax2)
+ Gmax2 = -G[i];
+ }
+ if(!is_lower_bound(i))
+ {
+ if(G[i] >= Gmax1)
+ Gmax1 = G[i];
+ }
+ }
+ }
+
+ if(unshrink == false && Gmax1 + Gmax2 <= eps*10)
+ {
+ unshrink = true;
+ reconstruct_gradient();
+ active_size = l;
+ }
+
+ for(i=0;i<active_size;i++)
+ {
+ if (be_shrunk(i, Gmax1, Gmax2))
+ {
+ active_size--;
+ while (active_size > i)
+ {
+ if (!be_shrunk(active_size, Gmax1, Gmax2))
+ {
+ swap_index(i,active_size);
+ break;
+ }
+ active_size--;
+ }
+ }
+ }
+
+ double calculate_rho()
+ {
+ double r;
+ int nr_free = 0;
+ double ub = INF, lb = -INF, sum_free = 0;
+ for(int i=0;i<active_size;i++)
+ {
+ double yG = y[i]*G[i];
+
+ if(is_lower_bound(i))
+ {
+ if(y[i] > 0)
+ ub = Math.min(ub,yG);
+ else
+ lb = Math.max(lb,yG);
+ }
+ else if(is_upper_bound(i))
+ {
+ if(y[i] < 0)
+ ub = Math.min(ub,yG);
+ else
+ lb = Math.max(lb,yG);
+ }
+ else
+ {
+ ++nr_free;
+ sum_free += yG;
+ }
+ }
+
+ if(nr_free>0)
+ r = sum_free/nr_free;
+ else
+ r = (ub+lb)/2;
+
+ return r;
+ }
+
+}
+
+//
+// Solver for nu-svm classification and regression
+//
+// additional constraint: e^T \alpha = constant
+//
+final class Solver_NU extends Solver
+{
+ private SolutionInfo si;
+
+ void Solve(int l, QMatrix Q, double[] p, byte[] y,
+ double[] alpha, double Cp, double Cn, double eps,
+ SolutionInfo si, int shrinking)
+ {
+ this.si = si;
+ super.Solve(l,Q,p,y,alpha,Cp,Cn,eps,si,shrinking);
+ }
+
+ // return 1 if already optimal, return 0 otherwise
+ int select_working_set(int[] working_set)
+ {
+ // return i,j such that y_i = y_j and
+ // i: maximizes -y_i * grad(f)_i, i in I_up(\alpha)
+ // j: minimizes the decrease of obj value
+ // (if quadratic coefficient <= 0, replace it with tau)
+ // -y_j*grad(f)_j < -y_i*grad(f)_i, j in I_low(\alpha)
+
+ double Gmaxp = -INF;
+ double Gmaxp2 = -INF;
+ int Gmaxp_idx = -1;
+
+ double Gmaxn = -INF;
+ double Gmaxn2 = -INF;
+ int Gmaxn_idx = -1;
+
+ int Gmin_idx = -1;
+ double obj_diff_min = INF;
+
+ for(int t=0;t<active_size;t++)
+ if(y[t]==+1)
+ {
+ if(!is_upper_bound(t))
+ if(-G[t] >= Gmaxp)
+ {
+ Gmaxp = -G[t];
+ Gmaxp_idx = t;
+ }
+ }
+ else
+ {
+ if(!is_lower_bound(t))
+ if(G[t] >= Gmaxn)
+ {
+ Gmaxn = G[t];
+ Gmaxn_idx = t;
+ }
+ }
+
+ int ip = Gmaxp_idx;
+ int in = Gmaxn_idx;
+ Qfloat[] Q_ip = null;
+ Qfloat[] Q_in = null;
+ if(ip != -1) // null Q_ip not accessed: Gmaxp=-INF if ip=-1
+ Q_ip = Q.get_Q(ip,active_size);
+ if(in != -1)
+ Q_in = Q.get_Q(in,active_size);
+
+ for(int j=0;j<active_size;j++)
+ {
+ if(y[j]==+1)
+ {
+ if (!is_lower_bound(j))
+ {
+ double grad_diff=Gmaxp+G[j];
+ if (G[j] >= Gmaxp2)
+ Gmaxp2 = G[j];
+ if (grad_diff > 0)
+ {
+ double obj_diff;
+ double quad_coef = QD[ip]+QD[j]-2*Q_ip[j];
+ if (quad_coef > 0)
+ obj_diff = -(grad_diff*grad_diff)/quad_coef;
+ else
+ obj_diff = -(grad_diff*grad_diff)/TAU;
+
+ if (obj_diff <= obj_diff_min)
+ {
+ Gmin_idx=j;
+ obj_diff_min = obj_diff;
+ }
+ }
+ }
+ }
+ else
+ {
+ if (!is_upper_bound(j))
+ {
+ double grad_diff=Gmaxn-G[j];
+ if (-G[j] >= Gmaxn2)
+ Gmaxn2 = -G[j];
+ if (grad_diff > 0)
+ {
+ double obj_diff;
+ double quad_coef = QD[in]+QD[j]-2*Q_in[j];
+ if (quad_coef > 0)
+ obj_diff = -(grad_diff*grad_diff)/quad_coef;
+ else
+ obj_diff = -(grad_diff*grad_diff)/TAU;
+
+ if (obj_diff <= obj_diff_min)
+ {
+ Gmin_idx=j;
+ obj_diff_min = obj_diff;
+ }
+ }
+ }
+ }
+ }
+
+ if(Math.max(Gmaxp+Gmaxp2,Gmaxn+Gmaxn2) < eps || Gmin_idx == -1)
+ return 1;
+
+ if(y[Gmin_idx] == +1)
+ working_set[0] = Gmaxp_idx;
+ else
+ working_set[0] = Gmaxn_idx;
+ working_set[1] = Gmin_idx;
+
+ return 0;
+ }
+
+ private boolean be_shrunk(int i, double Gmax1, double Gmax2, double Gmax3, double Gmax4)
+ {
+ if(is_upper_bound(i))
+ {
+ if(y[i]==+1)
+ return(-G[i] > Gmax1);
+ else
+ return(-G[i] > Gmax4);
+ }
+ else if(is_lower_bound(i))
+ {
+ if(y[i]==+1)
+ return(G[i] > Gmax2);
+ else
+ return(G[i] > Gmax3);
+ }
+ else
+ return(false);
+ }
+
+ void do_shrinking()
+ {
+ double Gmax1 = -INF; // max { -y_i * grad(f)_i | y_i = +1, i in I_up(\alpha) }
+ double Gmax2 = -INF; // max { y_i * grad(f)_i | y_i = +1, i in I_low(\alpha) }
+ double Gmax3 = -INF; // max { -y_i * grad(f)_i | y_i = -1, i in I_up(\alpha) }
+ double Gmax4 = -INF; // max { y_i * grad(f)_i | y_i = -1, i in I_low(\alpha) }
+
+ // find maximal violating pair first
+ int i;
+ for(i=0;i<active_size;i++)
+ {
+ if(!is_upper_bound(i))
+ {
+ if(y[i]==+1)
+ {
+ if(-G[i] > Gmax1) Gmax1 = -G[i];
+ }
+ else if(-G[i] > Gmax4) Gmax4 = -G[i];
+ }
+ if(!is_lower_bound(i))
+ {
+ if(y[i]==+1)
+ {
+ if(G[i] > Gmax2) Gmax2 = G[i];
+ }
+ else if(G[i] > Gmax3) Gmax3 = G[i];
+ }
+ }
+
+ if(unshrink == false && Math.max(Gmax1+Gmax2,Gmax3+Gmax4) <= eps*10)
+ {
+ unshrink = true;
+ reconstruct_gradient();
+ active_size = l;
+ }
+
+ for(i=0;i<active_size;i++)
+ {
+ if (be_shrunk(i, Gmax1, Gmax2, Gmax3, Gmax4))
+ {
+ active_size--;
+ while (active_size > i)
+ {
+ if (!be_shrunk(active_size, Gmax1, Gmax2, Gmax3, Gmax4))
+ {
+ swap_index(i,active_size);
+ break;
+ }
+ active_size--;
+ }
+ }
+ }
+
+ double calculate_rho()
+ {
+ int nr_free1 = 0,nr_free2 = 0;
+ double ub1 = INF, ub2 = INF;
+ double lb1 = -INF, lb2 = -INF;
+ double sum_free1 = 0, sum_free2 = 0;
+
+ for(int i=0;i<active_size;i++)
+ {
+ if(y[i]==+1)
+ {
+ if(is_upper_bound(i))
+ lb1 = Math.max(lb1,G[i]);
+ else if(is_lower_bound(i))
+ ub1 = Math.min(ub1,G[i]);
+ else
+ {
+ ++nr_free1;
+ sum_free1 += G[i];
+ }
+ }
+ else
+ {
+ if(is_upper_bound(i))
+ lb2 = Math.max(lb2,G[i]);
+ else if(is_lower_bound(i))
+ ub2 = Math.min(ub2,G[i]);
+ else
+ {
+ ++nr_free2;
+ sum_free2 += G[i];
+ }
+ }
+ }
+
+ double r1,r2;
+ if(nr_free1 > 0)
+ r1 = sum_free1/nr_free1;
+ else
+ r1 = (ub1+lb1)/2;
+
+ if(nr_free2 > 0)
+ r2 = sum_free2/nr_free2;
+ else
+ r2 = (ub2+lb2)/2;
+
+ si.r = (r1+r2)/2;
+ return (r1-r2)/2;
+ }
+}
+
+//
+// Q matrices for various formulations
+//
+class SVC_Q extends Kernel
+{
+ private final byte[] y;
+ private final Cache cache;
+ private final double[] QD;
+
+ SVC_Q(svm_problem prob, svm_parameter param, byte[] y_)
+ {
+ super(prob.l, prob.x, param);
+ y = (byte[])y_.clone();
+ cache = new Cache(prob.l,(long)(param.cache_size*(1<<20)));
+ QD = new double[prob.l];
+ for(int i=0;i<prob.l;i++)
+ QD[i] = kernel_function(i,i);
+ }
+
+ for(i=0;i<l;i++)
+ {
+ alpha[i] = 0;
+ minus_ones[i] = -1;
+ if(prob.y[i] > 0) y[i] = +1; else y[i] = -1;
+ }
+ }
+
+ Solver s = new Solver();
+ s.Solve(l, new SVC_Q(prob,param,y), minus_ones, y,
+ alpha, Cp, Cn, param.eps, si, param.shrinking);
+
+ double sum_alpha=0;
+ for(i=0;i<l;i++)
+ sum_alpha += alpha[i];
+
+ if (Cp==Cn)
+ svm.info("nu = "+sum_alpha/(Cp*prob.l)+"\n");
+
+ for(i=0;i<l;i++)
+ alpha[i] *= y[i];
+ }
+
+ private static void solve_nu_svc(svm_problem prob, svm_parameter param,
+ double[] alpha, Solver.SolutionInfo si)
+ {
+ int i;
+ int l = prob.l;
+ double nu = param.nu;
+
+ byte[] y = new byte[l];
+
+ for(i=0;i<l;i++)
+ if(prob.y[i]>0)
+ y[i] = +1;
+ else
+ y[i] = -1;
+
+ double sum_pos = nu*l/2;
+ double sum_neg = nu*l/2;
+
+ for(i=0;i<l;i++)
+ {
+ if(y[i] == +1)
+ {
+ alpha[i] = Math.min(1.0,sum_pos);
+ sum_pos -= alpha[i];
+ }
+ else
+ {
+ alpha[i] = Math.min(1.0,sum_neg);
+ sum_neg -= alpha[i];
+ }
+ }
+
+ int nSV = 0;
+ int nBSV = 0;
+ for(i=0;i<prob.l;i++)
+ {
+ if(Math.abs(alpha[i]) > 0)
+ {
+ ++nSV;
+ if(prob.y[i] > 0)
+ {
+ if(Math.abs(alpha[i]) >= si.upper_bound_p)
+ ++nBSV;
+ }
+ else
+ {
+ if(Math.abs(alpha[i]) >= si.upper_bound_n)
+ ++nBSV;
+ }
+ }
+ }
+
+ svm.info("nSV = "+nSV+", nBSV = "+nBSV+"\n");
+
+ decision_function f = new decision_function();
+ f.alpha = alpha;
+ f.rho = si.rho;
+ return f;
+ }
+
+ // Platt's binary SVM Probabilistic Output: an improvement from Lin et al.
+ private static void sigmoid_train(int l, double[] dec_values, double[] labels,
+ double[] probAB)
+ {
+ double A, B;
+ double prior1=0, prior0 = 0;
+ int i;
+
+ for (i=0;i<l;i++)
+ if (labels[i] > 0) prior1+=1;
+ else prior0+=1;
+
+ int max_iter=100; // Maximal number of iterations
+ double min_step=1e-10; // Minimal step taken in line search
+ double sigma=1e-12; // For numerically strict PD of Hessian
+ double eps=1e-5;
+ double hiTarget=(prior1+1.0)/(prior1+2.0);
+ double loTarget=1/(prior0+2.0);
+ double[] t= new double[l];
+ double fApB,p,q,h11,h22,h21,g1,g2,det,dA,dB,gd,stepsize;
+ double newA,newB,newf,d1,d2;
+ int iter;
+
+ // Initial Point and Initial Fun Value
+ A=0.0; B=Math.log((prior0+1.0)/(prior1+1.0));
+ double fval = 0.0;
+
+ for (i=0;i<l;i++)
+ {
+ if (labels[i]>0) t[i]=hiTarget;
+ else t[i]=loTarget;
+ fApB = dec_values[i]*A+B;
+ if (fApB>=0)
+ fval += t[i]*fApB + Math.log(1+Math.exp(-fApB));
+ else
+ fval += (t[i] - 1)*fApB +Math.log(1+Math.exp(fApB));
+ }
+ for (iter=0;iter<max_iter;iter++)
+ {
+ // Update Gradient and Hessian (use H' = H + sigma I)
+ h11=sigma; // numerically ensures strict PD
+ h22=sigma;
+ h21=0.0;g1=0.0;g2=0.0;
+ for (i=0;i<l;i++)
+ {
+ fApB = dec_values[i]*A+B;
+ if (fApB >= 0)
+ {
+ p=Math.exp(-fApB)/(1.0+Math.exp(-fApB));
+ q=1.0/(1.0+Math.exp(-fApB));
+ }
+ else
+ {
+ p=1.0/(1.0+Math.exp(fApB));
+ q=Math.exp(fApB)/(1.0+Math.exp(fApB));
+ }
+ d2=p*q;
+ h11+=dec_values[i]*dec_values[i]*d2;
+ h22+=d2;
+ h21+=dec_values[i]*d2;
+ d1=t[i]-p;
+ g1+=dec_values[i]*d1;
+ g2+=d1;
+ }
+
+ // Stopping Criteria
+ if (Math.abs(g1)<eps && Math.abs(g2)<eps)
+ break;
+
+ // Finding Newton direction: -inv(H') * g
+ det=h11*h22-h21*h21;
+ dA=-(h22*g1 - h21 * g2) / det;
+ dB=-(-h21*g1+ h11 * g2) / det;
+ gd=g1*dA+g2*dB;
+
+ stepsize = 1; // Line search
+ while (stepsize >= min_step)
+ {
+ newA = A + stepsize * dA;
+ newB = B + stepsize * dB;
+
+ // New function value
+ newf = 0.0;
+ for (i=0;i<l;i++)
+ {
+ fApB = dec_values[i]*newA+newB;
+ if (fApB >= 0)
+ newf += t[i]*fApB + Math.log(1+Math.exp(-fApB));
+ else
+ newf += (t[i] - 1)*fApB +Math.log(1+Math.exp(fApB));
+ }
+ // Check sufficient decrease
+ if (newf<fval+0.0001*stepsize*gd)
+ {
+ A=newA; B=newB; fval=newf;
+ break; // Sufficient decrease satisfied
+ }
+ else
+ stepsize = stepsize / 2.0;
+ }
+
+ if (stepsize < min_step)
+ {
+ svm.info("Line search fails in two-class probability estimates\n");
+ break;
+ }
+ }
+
+ if (iter>=max_iter)
+ svm.info("Reaching maximal iterations in two-class probability estimates\n");
+ probAB[0]=A;probAB[1]=B;
+ }
+
+ private static double sigmoid_predict(double decision_value, double A, double B)
+ {
+ double fApB = decision_value*A+B;
+ if (fApB >= 0)
+ return Math.exp(-fApB)/(1.0+Math.exp(-fApB));
+ else
+ return 1.0/(1+Math.exp(fApB)) ;
+ }
+
+ // Method 2 from the multiclass_prob paper by Wu, Lin, and Weng
+ private static void multiclass_probability(int k, double[][] r, double[] p)
+ {
+ int t,j;
+ int iter = 0, max_iter=Math.max(100,k);
+ double[][] Q=new double[k][k];
+ double[] Qp=new double[k];
+ double pQp, eps=0.005/k;
+
+ for (t=0;t<k;t++)
+ {
+ p[t]=1.0/k; // Valid if k = 1
+ Q[t][t]=0;
+ for (j=0;j<t;j++)
+ {
+ Q[t][t]+=r[j][t]*r[j][t];
+ Q[t][j]=Q[j][t];
+ }
+ for (j=t+1;j<k;j++)
+ {
+ Q[t][t]+=r[j][t]*r[j][t];
+ Q[t][j]=-r[j][t]*r[t][j];
+ }
+ }
+ for (iter=0;iter<max_iter;iter++)
+ {
+ // stopping condition, recalculate QP,pQP for numerical accuracy
+ pQp=0;
+ for (t=0;t<k;t++)
+ {
+ Qp[t]=0;
+ for (j=0;j<k;j++)
+ Qp[t]+=Q[t][j]*p[j];
+ pQp+=p[t]*Qp[t];
+ }
+ double max_error=0;
+ for (t=0;t<k;t++)
+ {
+ double error=Math.abs(Qp[t]-pQp);
+ if (error>max_error)
+ max_error=error;
+ }
+ if (max_error<eps) break;
+
+ for (t=0;t<k;t++)
+ {
+ double diff=(-Qp[t]+pQp)/Q[t][t];
+ p[t]+=diff;
+ pQp=(pQp+diff*(diff*Q[t][t]+2*Qp[t]))/(1+diff)/(1+diff);
+ for (j=0;j<k;j++)
+ {
+ Qp[j]=(Qp[j]+diff*Q[t][j])/(1+diff);
+ p[j]/=(1+diff);
+ }
+ }
+ }
+ if (iter>=max_iter)
+ svm.info("Exceeds max_iter in multiclass_prob\n");
+ }
+
+ // Cross-validation decision values for probability estimates
+ private static void svm_binary_svc_probability(svm_problem prob, svm_parameter param, double Cp, double Cn, double[] probAB)
+ {
+ int i;
+ int nr_fold = 5;
+ int[] perm = new int[prob.l];
+ double[] dec_values = new double[prob.l];
+
+ // random shuffle
+ for(i=0;i<prob.l;i++) perm[i]=i;
+ for(i=0;i<prob.l;i++)
+ {
+ int j = i+rand.nextInt(prob.l-i);
+ swap(int,perm[i],perm[j]);
+ }
+ for(i=0;i<nr_fold;i++)
+ {
+ int begin = i*prob.l/nr_fold;
+ int end = (i+1)*prob.l/nr_fold;
+ int j,k;
+ svm_problem subprob = new svm_problem();
+
+ subprob.l = prob.l-(end-begin);
+ subprob.x = new svm_node[subprob.l][];
+ subprob.y = new double[subprob.l];
+
+ k=0;
+ for(j=0;j<begin;j++)
+ {
+ subprob.x[k] = prob.x[perm[j]];
+ subprob.y[k] = prob.y[perm[j]];
+ ++k;
+ }
+ for(j=end;j<prob.l;j++)
+ {
+ subprob.x[k] = prob.x[perm[j]];
+ subprob.y[k] = prob.y[perm[j]];
+ ++k;
+ }
+ int p_count=0,n_count=0;
+ for(j=begin;j<end;j++)
+ if(prob.y[perm[j]]>0)
+ p_count++;
+ else
+ n_count++;
+
+ if(p_count==0 && n_count==0)
+ for(j=begin;j<end;j++)
+ dec_values[perm[j]] = 0;
+ else if(p_count > 0 && n_count == 0)
+ for(j=begin;j<end;j++)
+ dec_values[perm[j]] = 1;
+ else if(p_count == 0 && n_count > 0)
+ for(j=begin;j<end;j++)
+ dec_values[perm[j]] = -1;
+
+ for(i=0;i<prob.l;i++)
+ if (Math.abs(ymv[i]) > 5*std)
+ count=count+1;
+ else
+ mae+=Math.abs(ymv[i]);
+ mae /= (prob.l-count);
+ svm.info("Prob. model for test data: target value = predicted value + z,\nz: Laplace distribution e^(-|z|/sigma)/(2sigma),sigma="+mae+"\n");
+ return mae;
+ }
+
+ // label: label name, start: begin of each class, count: #data of classes, perm: indices to the original data
+ // perm, length l, must be allocated before calling this subroutine
+ private static void svm_group_classes(svm_problem prob, int[] nr_class_ret, int[][] label_ret, int[][] start_ret, int[][] count_ret, int[] perm)
+ {
+ int l = prob.l;
+ int max_nr_class = 16;
+ int nr_class = 0;
+ int[] label = new int[max_nr_class];
+ int[] count = new int[max_nr_class];
+ int[] data_label = new int[l];
+ int i;
+
+ for(i=0;i<l;i++)
+ {
+ int this_label = (int)(prob.y[i]);
+ int j;
+ for(j=0;j<nr_class;j++)
+ if(this_label == label[j])
+ {
+ ++count[j];
+ break;
+ }
+ data_label[i] = j;
+ }
+
+ int nSV = 0;
+ for(i=0;i<prob.l;i++)
+ if(Math.abs(f.alpha[i]) > 0) ++nSV;
+ model.l = nSV;
+ model.SV = new svm_node[nSV][];
+ model.sv_coef[0] = new double[nSV];
+ model.sv_indices = new int[nSV];
+ int j = 0;
+ for(i=0;i<prob.l;i++)
+ if(Math.abs(f.alpha[i]) > 0)
+ {
+ model.SV[j] = prob.x[i];
+ model.sv_coef[0][j] = f.alpha[i];
+ model.sv_indices[j] = i+1;
+ ++j;
+ }
+ }
+ else
+ {
+ // classification
+ int l = prob.l;
+ int[] tmp_nr_class = new int[1];
+ int[][] tmp_label = new int[1][];
+ int[][] tmp_start = new int[1][];
+ int[][] tmp_count = new int[1][];
+ int[] perm = new int[l];
+
+ // group training data of the same class
+ svm_group_classes(prob,tmp_nr_class,tmp_label,tmp_start,tmp_count,perm);
+ int nr_class = tmp_nr_class[0];
+ int[] label = tmp_label[0];
+ int[] start = tmp_start[0];
+ int[] count = tmp_count[0];
+
+ if(nr_class == 1)
+ svm.info("WARNING: training data in only one class. See README for details.\n");
+
+ svm_node[][] x = new svm_node[l][];
+ int i;
+ for(i=0;i<l;i++)
+ x[i] = prob.x[perm[i]];
+
+ for(k=0;k<ci;k++)
+ if(!nonzero[si+k] && Math.abs(f[p].alpha[k]) > 0)
+ nonzero[si+k] = true;
+ for(k=0;k<cj;k++)
+ if(!nonzero[sj+k] && Math.abs(f[p].alpha[ci+k]) > 0)
+ nonzero[sj+k] = true;
+ ++p;
+ }
+
+ // build output
+
+ model.nr_class = nr_class;
+
+ model.label = new int[nr_class];
+ for(i=0;i<nr_class;i++)
+ model.label[i] = label[i];
+
+ // stratified cv may not give leave-one-out rate
+ // Each class to l folds -> some folds may have zero elements
+ if((param.svm_type == svm_parameter.C_SVC ||
+ param.svm_type == svm_parameter.NU_SVC) && nr_fold < l)
+ {
+ int[] tmp_nr_class = new int[1];
+ int[][] tmp_label = new int[1][];
+ int[][] tmp_start = new int[1][];
+ int[][] tmp_count = new int[1][];
+
+ svm_group_classes(prob,tmp_nr_class,tmp_label,tmp_start,tmp_count,perm);
+
+ int nr_class = tmp_nr_class[0];
+ int[] start = tmp_start[0];
+ int[] count = tmp_count[0];
+
+ // random shuffle and then data grouped by fold using the array perm
+ int[] fold_count = new int[nr_fold];
+ int c;
+ int[] index = new int[l];
+ for(i=0;i<l;i++)
+ index[i]=perm[i];
+
+ if(model.param.svm_type == svm_parameter.ONE_CLASS)
+ return (sum>0)?1:-1;
+ else
+ return sum;
+ }
+ else
+ {
+ int nr_class = model.nr_class;
+ int l = model.l;
+
+ double[] kvalue = new double[l];
+ for(i=0;i<l;i++)
+ kvalue[i] = Kernel.k_function(x,model.SV[i],model.param);
+
+ int[] start = new int[nr_class];
+ start[0] = 0;
+ for(i=1;i<nr_class;i++)
+ start[i] = start[i-1]+model.nSV[i-1];
+
+ int[] vote = new int[nr_class];
+ for(i=0;i<nr_class;i++)
+ vote[i] = 0;
+
+ int p=0;
+ for(i=0;i<nr_class;i++)
+ for(int j=i+1;j<nr_class;j++)
+ {
+ double sum = 0;
+ int si = start[i];
+ int sj = start[j];
+ int ci = model.nSV[i];
+ int cj = model.nSV[j];
+
+ int k;
+ double[] coef1 = model.sv_coef[j-1];
+ double[] coef2 = model.sv_coef[i];
+ for(k=0;k<ci;k++)
+ sum += coef1[si+k] * kvalue[si+k];
+ for(k=0;k<cj;k++)
+ sum += coef2[sj+k] * kvalue[sj+k];
+ sum -= model.rho[p];
+ dec_values[p] = sum;
+
+ if(dec_values[p] > 0)
+ ++vote[i];
+ else
+ ++vote[j];
+ p++;
+ }
+
+ int vote_max_idx = 0;
+ for(i=1;i<nr_class;i++)
+ if(vote[i] > vote[vote_max_idx])
+ vote_max_idx = i;
+
+ return model.label[vote_max_idx];
+ }
+ }
+
+ public static double svm_predict(svm_model model, svm_node[] x)
+ {
+ int nr_class = model.nr_class;
+ double[] dec_values;
+ if(model.param.svm_type == svm_parameter.ONE_CLASS ||
+ model.param.svm_type == svm_parameter.EPSILON_SVR ||
+ model.param.svm_type == svm_parameter.NU_SVR)
+ dec_values = new double[1];
+ else
+ dec_values = new double[nr_class*(nr_class-1)/2];
+ double pred_result = svm_predict_values(model, x, dec_values);
+ return pred_result;
+ }
+
+ public static double svm_predict_probability(svm_model model, svm_node[] x, double[] prob_estimates)
+ {
+ if ((model.param.svm_type == svm_parameter.C_SVC || model.param.svm_type == svm_parameter.NU_SVC) &&
+ model.probA!=null && model.probB!=null)
+ {
+ int i;
+ int nr_class = model.nr_class;
+ double[] dec_values = new double[nr_class*(nr_class-1)/2];
+ svm_predict_values(model, x, dec_values);
+
+ double min_prob=1e-7;
+ double[][] pairwise_prob=new double[nr_class][nr_class];
+
+ int k=0;
+ for(i=0;i<nr_class;i++)
+ for(int j=i+1;j<nr_class;j++)
+ {
+ pairwise_prob[i][j]=Math.min(Math.max(sigmoid_predict(dec_values[k],model.probA[k],model.probB[k]),min_prob),1-min_prob);
+ pairwise_prob[j][i]=1-pairwise_prob[i][j];
+ k++;
+ }
+ multiclass_probability(nr_class,pairwise_prob,prob_estimates);
+
+ int prob_max_idx = 0;
+ for(i=1;i<nr_class;i++)
+ if(prob_estimates[i] > prob_estimates[prob_max_idx])
+ prob_max_idx = i;
+ return model.label[prob_max_idx];
+ }
+ else
+ return svm_predict(model, x);
+ }
+
+ static final String svm_type_table[] =
+ {
+ "c_svc","nu_svc","one_class","epsilon_svr","nu_svr",
+ };
+
+ static final String kernel_type_table[]=
+ {
+ "linear","polynomial","rbf","sigmoid","precomputed"
+ };
+
+ public static void svm_save_model(String model_file_name, svm_model model) throws IOException
+ {
+ DataOutputStream fp = new DataOutputStream(new BufferedOutputStream(new FileOutputStream(model_file_name)));
+
+ svm_parameter param = model.param;
+
+ fp.writeBytes("svm_type "+svm_type_table[param.svm_type]+"\n");
+ fp.writeBytes("kernel_type "+kernel_type_table[param.kernel_type]+"\n");
+
+ if(param.kernel_type == svm_parameter.POLY)
+ fp.writeBytes("degree "+param.degree+"\n");
+
+ if(param.kernel_type == svm_parameter.POLY ||
+ param.kernel_type == svm_parameter.RBF ||
+ param.kernel_type == svm_parameter.SIGMOID)
+ fp.writeBytes("gamma "+param.gamma+"\n");
+
+ if(param.kernel_type == svm_parameter.POLY ||
+ param.kernel_type == svm_parameter.SIGMOID)
+ fp.writeBytes("coef0 "+param.coef0+"\n");
+
+ int nr_class = model.nr_class;
+ int l = model.l;
+ fp.writeBytes("nr_class "+nr_class+"\n");
+ fp.writeBytes("total_sv "+l+"\n");
+
+ {
+ fp.writeBytes("rho");
+ for(int i=0;i<nr_class*(nr_class-1)/2;i++)
+ fp.writeBytes(" "+model.rho[i]);
+ fp.writeBytes("\n");
+
+ if(svm_type == svm_parameter.NU_SVC ||
+ svm_type == svm_parameter.ONE_CLASS ||
+ svm_type == svm_parameter.NU_SVR)
+ if(param.nu <= 0 || param.nu > 1)
+ return "nu <= 0 or nu > 1";
+
+ if(svm_type == svm_parameter.EPSILON_SVR)
+ if(param.p < 0)
+ return "p < 0";
+
+ if(param.shrinking != 0 &&
+ param.shrinking != 1)
+ return "shrinking != 0 and shrinking != 1";
+
+ if(param.probability != 0 &&
+ param.probability != 1)
+ return "probability != 0 and probability != 1";
+
+ if(param.probability == 1 &&
+ svm_type == svm_parameter.ONE_CLASS)
+ return "one-class SVM probability output not supported yet";
+
+ // check whether nu-svc is feasible
+
+ if(svm_type == svm_parameter.NU_SVC)
+ {
+ int l = prob.l;
+ int max_nr_class = 16;
+ int nr_class = 0;
+ int[] label = new int[max_nr_class];
+ int[] count = new int[max_nr_class];
+
+ int i;
+ for(i=0;i<nr_class;i++)
+ {
+ int n1 = count[i];
+ for(int j=i+1;j<nr_class;j++)
+ {
+ int n2 = count[j];
+ if(param.nu*(n1+n2)/2 > Math.min(n1,n2))
+ return "specified nu is infeasible";
+ }
+ }
+ }
+
+ return null;
+ }
+
+ public static int svm_check_probability_model(svm_model model)
+ {
+ if (((model.param.svm_type == svm_parameter.C_SVC || model.param.svm_type == svm_parameter.NU_SVC) &&
+ model.probA!=null && model.probB!=null) ||
+ ((model.param.svm_type == svm_parameter.EPSILON_SVR || model.param.svm_type == svm_parameter.NU_SVR) &&
+ model.probA!=null))
+ return 1;
+ else
+ return 0;
+ }
+
+ public static void svm_set_print_string_function(svm_print_interface print_func)
+ {
+ if (print_func == null)
+ svm_print_string = svm_print_stdout;
+ else
+ svm_print_string = print_func;
+ }
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_model.java b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_model.java
new file mode 100644
index 0000000..a38be3f
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_model.java
@@ -0,0 +1,22 @@
+//
+// svm_model
+//
+package libsvm;
+public class svm_model implements java.io.Serializable
+{
+ public svm_parameter param; // parameter
+ public int nr_class; // number of classes, = 2 in regression/one class svm
+ public int l; // total #SV
+ public svm_node[][] SV; // SVs (SV[l])
+ public double[][] sv_coef; // coefficients for SVs in decision functions (sv_coef[k-1][l])
+ public double[] rho; // constants in decision functions (rho[k*(k-1)/2])
+ public double[] probA; // pairwise probability information
+ public double[] probB;
+ public int[] sv_indices; // sv_indices[0,...,nSV-1] are values in [1,...,num_training_data] to indicate SVs in the training set
+
+ // for classification only
+
+ public int[] label; // label of each class (label[k])
+ public int[] nSV; // number of SVs for each class (nSV[k])
+ // nSV[0] + nSV[1] + ... + nSV[k-1] = l
+};
diff --git a/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_node.java b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_node.java
new file mode 100644
index 0000000..9ab0a10
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_node.java
@@ -0,0 +1,6 @@
+package libsvm;
+public class svm_node implements java.io.Serializable
+{
+ public int index;
+ public double value;
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_parameter.java b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_parameter.java
new file mode 100644
index 0000000..429f041
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_parameter.java
@@ -0,0 +1,47 @@
+package libsvm;
+public class svm_parameter implements Cloneable,java.io.Serializable
+{
+ /* svm_type */
+ public static final int C_SVC = 0;
+ public static final int NU_SVC = 1;
+ public static final int ONE_CLASS = 2;
+ public static final int EPSILON_SVR = 3;
+ public static final int NU_SVR = 4;
+
+ /* kernel_type */
+ public static final int LINEAR = 0;
+ public static final int POLY = 1;
+ public static final int RBF = 2;
+ public static final int SIGMOID = 3;
+ public static final int PRECOMPUTED = 4;
+
+ public int svm_type;
+ public int kernel_type;
+ public int degree; // for poly
+ public double gamma; // for poly/rbf/sigmoid
+ public double coef0; // for poly/sigmoid
+
+ // these are for training only
+ public double cache_size; // in MB
+ public double eps; // stopping criteria
+ public double C; // for C_SVC, EPSILON_SVR and NU_SVR
+ public int nr_weight; // for C_SVC
+ public int[] weight_label; // for C_SVC
+ public double[] weight; // for C_SVC
+ public double nu; // for NU_SVC, ONE_CLASS, and NU_SVR
+ public double p; // for EPSILON_SVR
+ public int shrinking; // use the shrinking heuristics
+ public int probability; // do probability estimates
+
+ public Object clone()
+ {
+ try
+ {
+ return super.clone();
+ } catch (CloneNotSupportedException e)
+ {
+ return null;
+ }
+ }
+
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_print_interface.java b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_print_interface.java
new file mode 100644
index 0000000..ff4d0e8
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_print_interface.java
@@ -0,0 +1,5 @@
+package libsvm;
+public interface svm_print_interface
+{
+ public void print(String s);
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_problem.java b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_problem.java
new file mode 100644
index 0000000..5d74609
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/java/libsvm/svm_problem.java
@@ -0,0 +1,7 @@
+package libsvm;
+public class svm_problem implements java.io.Serializable
+{
+ public int l;
+ public double[] y;
+ public svm_node[][] x;
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/java/svm_predict.java b/src/backend/app/algorithms/evaluate/libsvm/java/svm_predict.java
new file mode 100644
index 0000000..d714c5b
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/java/svm_predict.java
@@ -0,0 +1,194 @@
+import libsvm.*;
+import java.io.*;
+import java.util.*;
+
+class svm_predict {
+ private static svm_print_interface svm_print_null = new svm_print_interface()
+ {
+ public void print(String s) {}
+ };
+
+ private static svm_print_interface svm_print_stdout = new svm_print_interface()
+ {
+ public void print(String s)
+ {
+ System.out.print(s);
+ }
+ };
+
+ private static svm_print_interface svm_print_string = svm_print_stdout;
+
+ static void info(String s)
+ {
+ svm_print_string.print(s);
+ }
+
+ private static double atof(String s)
+ {
+ return Double.valueOf(s).doubleValue();
+ }
+
+ private static int atoi(String s)
+ {
+ return Integer.parseInt(s);
+ }
+
+ private static void predict(BufferedReader input, DataOutputStream output, svm_model model, int predict_probability) throws IOException
+ {
+ int correct = 0;
+ int total = 0;
+ double error = 0;
+ double sumv = 0, sumy = 0, sumvv = 0, sumyy = 0, sumvy = 0;
+
+ int svm_type=svm.svm_get_svm_type(model);
+ int nr_class=svm.svm_get_nr_class(model);
+ double[] prob_estimates=null;
+
+ if(predict_probability == 1)
+ {
+ if(svm_type == svm_parameter.EPSILON_SVR ||
+ svm_type == svm_parameter.NU_SVR)
+ {
+ svm_predict.info("Prob. model for test data: target value = predicted value + z,\nz: Laplace distribution e^(-|z|/sigma)/(2sigma),sigma="+svm.svm_get_svr_probability(model)+"\n");
+ }
+ else
+ {
+ int[] labels=new int[nr_class];
+ svm.svm_get_labels(model,labels);
+ prob_estimates = new double[nr_class];
+ output.writeBytes("labels");
+ for(int j=0;j<nr_class;j++)
+ output.writeBytes(" "+labels[j]);
+ output.writeBytes("\n");
+ }
+ }
+ while(true)
+ {
+ String line = input.readLine();
+ if(line == null) break;
+
+ StringTokenizer st = new StringTokenizer(line," \t\n\r\f:");
+
+ double target = atof(st.nextToken());
+ int m = st.countTokens()/2;
+ svm_node[] x = new svm_node[m];
+ for(int j=0;j<m;j++)
+ {
+ x[j] = new svm_node();
+ x[j].index = atoi(st.nextToken());
+ x[j].value = atof(st.nextToken());
+ }
+
+ double v;
+ if (predict_probability==1 && (svm_type==svm_parameter.C_SVC || svm_type==svm_parameter.NU_SVC))
+ {
+ v = svm.svm_predict_probability(model,x,prob_estimates);
+ output.writeBytes(v+" ");
+ for(int j=0;j<nr_class;j++)
+ output.writeBytes(prob_estimates[j]+" ");
+ output.writeBytes("\n");
+ }
+ else
+ {
+ v = svm.svm_predict(model,x);
+ output.writeBytes(v+"\n");
+ }
+
+ if(v == target)
+ ++correct;
+ error += (v-target)*(v-target);
+ sumv += v;
+ sumy += target;
+ sumvv += v*v;
+ sumyy += target*target;
+ sumvy += v*target;
+ ++total;
+ }
+ if(svm_type == svm_parameter.EPSILON_SVR ||
+ svm_type == svm_parameter.NU_SVR)
+ {
+ svm_predict.info("Mean squared error = "+error/total+" (regression)\n");
+ svm_predict.info("Squared correlation coefficient = "+
+ ((total*sumvy-sumv*sumy)*(total*sumvy-sumv*sumy))/
+ ((total*sumvv-sumv*sumv)*(total*sumyy-sumy*sumy))+
+ " (regression)\n");
+ }
+ else
+ svm_predict.info("Accuracy = "+(double)correct/total*100+
+ "% ("+correct+"/"+total+") (classification)\n");
+ }
+
+ private static void exit_with_help()
+ {
+ System.err.print("usage: svm_predict [options] test_file model_file output_file\n"
+ +"options:\n"
+ +"-b probability_estimates: whether to predict probability estimates, 0 or 1 (default 0); one-class SVM not supported yet\n"
+ +"-q : quiet mode (no outputs)\n");
+ System.exit(1);
+ }
+
+ public static void main(String argv[]) throws IOException
+ {
+ int i, predict_probability=0;
+ svm_print_string = svm_print_stdout;
+
+ // parse options
+ for(i=0;i<argv.length;i++)
+ {
+ if(argv[i].charAt(0) != '-') break;
+ ++i;
+ switch(argv[i-1].charAt(1))
+ {
+ case 'b':
+ predict_probability = atoi(argv[i]);
+ break;
+ case 'q':
+ svm_print_string = svm_print_null;
+ i--;
+ break;
+ default:
+ System.err.print("Unknown option: " + argv[i-1] + "\n");
+ exit_with_help();
+ }
+ }
+
+ if(i>=argv.length-2)
+ exit_with_help();
+ try
+ {
+ BufferedReader input = new BufferedReader(new FileReader(argv[i]));
+ DataOutputStream output = new DataOutputStream(new BufferedOutputStream(new FileOutputStream(argv[i+2])));
+ svm_model model = svm.svm_load_model(argv[i+1]);
+ if (model == null)
+ {
+ System.err.print("can't open model file "+argv[i+1]+"\n");
+ System.exit(1);
+ }
+ if(predict_probability == 1)
+ {
+ if(svm.svm_check_probability_model(model)==0)
+ {
+ System.err.print("Model does not support probabiliy estimates\n");
+ System.exit(1);
+ }
+ }
+ else
+ {
+ if(svm.svm_check_probability_model(model)!=0)
+ {
+ svm_predict.info("Model supports probability estimates, but disabled in prediction.\n");
+ }
+ }
+ predict(input,output,model,predict_probability);
+ input.close();
+ output.close();
+ }
+ catch(FileNotFoundException e)
+ {
+ exit_with_help();
+ }
+ catch(ArrayIndexOutOfBoundsException e)
+ {
+ exit_with_help();
+ }
+ }
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/java/svm_scale.java b/src/backend/app/algorithms/evaluate/libsvm/java/svm_scale.java
new file mode 100644
index 0000000..6e8d458
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/java/svm_scale.java
@@ -0,0 +1,350 @@
+import libsvm.*;
+import java.io.*;
+import java.util.*;
+import java.text.DecimalFormat;
+
+class svm_scale
+{
+ private String line = null;
+ private double lower = -1.0;
+ private double upper = 1.0;
+ private double y_lower;
+ private double y_upper;
+ private boolean y_scaling = false;
+ private double[] feature_max;
+ private double[] feature_min;
+ private double y_max = -Double.MAX_VALUE;
+ private double y_min = Double.MAX_VALUE;
+ private int max_index;
+ private long num_nonzeros = 0;
+ private long new_num_nonzeros = 0;
+
+ private static void exit_with_help()
+ {
+ System.out.print(
+ "Usage: svm-scale [options] data_filename\n"
+ +"options:\n"
+ +"-l lower : x scaling lower limit (default -1)\n"
+ +"-u upper : x scaling upper limit (default +1)\n"
+ +"-y y_lower y_upper : y scaling limits (default: no y scaling)\n"
+ +"-s save_filename : save scaling parameters to save_filename\n"
+ +"-r restore_filename : restore scaling parameters from restore_filename\n"
+ );
+ System.exit(1);
+ }
+
+ private BufferedReader rewind(BufferedReader fp, String filename) throws IOException
+ {
+ fp.close();
+ return new BufferedReader(new FileReader(filename));
+ }
+
+ private void output_target(double value)
+ {
+ if(y_scaling)
+ {
+ if(value == y_min)
+ value = y_lower;
+ else if(value == y_max)
+ value = y_upper;
+ else
+ value = y_lower + (y_upper-y_lower) *
+ (value-y_min) / (y_max-y_min);
+ }
+
+ System.out.print(value + " ");
+ }
+
+ private void output(int index, double value)
+ {
+ /* skip single-valued attribute */
+ if(feature_max[index] == feature_min[index])
+ return;
+
+ if(value == feature_min[index])
+ value = lower;
+ else if(value == feature_max[index])
+ value = upper;
+ else
+ value = lower + (upper-lower) *
+ (value-feature_min[index])/
+ (feature_max[index]-feature_min[index]);
+
+ if(value != 0)
+ {
+ System.out.print(index + ":" + value + " ");
+ new_num_nonzeros++;
+ }
+ }
+
+ private String readline(BufferedReader fp) throws IOException
+ {
+ line = fp.readLine();
+ return line;
+ }
+
+ private void run(String []argv) throws IOException
+ {
+ int i,index;
+ BufferedReader fp = null, fp_restore = null;
+ String save_filename = null;
+ String restore_filename = null;
+ String data_filename = null;
+
+
+ for(i=0;i<argv.length;i++)
+ {
+ if (argv[i].charAt(0) != '-') break;
+ ++i;
+ switch(argv[i-1].charAt(1))
+ {
+ case 'l': lower = Double.parseDouble(argv[i]); break;
+ case 'u': upper = Double.parseDouble(argv[i]); break;
+ case 'y':
+ y_lower = Double.parseDouble(argv[i]);
+ ++i;
+ y_upper = Double.parseDouble(argv[i]);
+ y_scaling = true;
+ break;
+ case 's': save_filename = argv[i]; break;
+ case 'r': restore_filename = argv[i]; break;
+ default:
+ System.err.println("unknown option");
+ exit_with_help();
+ }
+ }
+
+ if(!(upper > lower) || (y_scaling && !(y_upper > y_lower)))
+ {
+ System.err.println("inconsistent lower/upper specification");
+ System.exit(1);
+ }
+ if(restore_filename != null && save_filename != null)
+ {
+ System.err.println("cannot use -r and -s simultaneously");
+ System.exit(1);
+ }
+
+ if(argv.length != i+1)
+ exit_with_help();
+
+ data_filename = argv[i];
+ try {
+ fp = new BufferedReader(new FileReader(data_filename));
+ } catch (Exception e) {
+ System.err.println("can't open file " + data_filename);
+ System.exit(1);
+ }
+
+ /* assumption: min index of attributes is 1 */
+ /* pass 1: find out max index of attributes */
+ max_index = 0;
+
+ if(restore_filename != null)
+ {
+ int idx, c;
+
+ try {
+ fp_restore = new BufferedReader(new FileReader(restore_filename));
+ }
+ catch (Exception e) {
+ System.err.println("can't open file " + restore_filename);
+ System.exit(1);
+ }
+ if((c = fp_restore.read()) == 'y')
+ {
+ fp_restore.readLine();
+ fp_restore.readLine();
+ fp_restore.readLine();
+ }
+ fp_restore.readLine();
+ fp_restore.readLine();
+
+ String restore_line = null;
+ while((restore_line = fp_restore.readLine())!=null)
+ {
+ StringTokenizer st2 = new StringTokenizer(restore_line);
+ idx = Integer.parseInt(st2.nextToken());
+ max_index = Math.max(max_index, idx);
+ }
+ fp_restore = rewind(fp_restore, restore_filename);
+ }
+
+ while (readline(fp) != null)
+ {
+ StringTokenizer st = new StringTokenizer(line," \t\n\r\f:");
+ st.nextToken();
+ while(st.hasMoreTokens())
+ {
+ index = Integer.parseInt(st.nextToken());
+ max_index = Math.max(max_index, index);
+ st.nextToken();
+ num_nonzeros++;
+ }
+ }
+
+ try {
+ feature_max = new double[(max_index+1)];
+ feature_min = new double[(max_index+1)];
+ } catch(OutOfMemoryError e) {
+ System.err.println("can't allocate enough memory");
+ System.exit(1);
+ }
+
+ for(i=0;i<=max_index;i++)
+ {
+ feature_max[i] = -Double.MAX_VALUE;
+ feature_min[i] = Double.MAX_VALUE;
+ }
+
+ fp = rewind(fp, data_filename);
+
+ /* pass 2: find out min/max value */
+ while(readline(fp) != null)
+ {
+ int next_index = 1;
+ double target;
+ double value;
+
+ StringTokenizer st = new StringTokenizer(line," \t\n\r\f:");
+ target = Double.parseDouble(st.nextToken());
+ y_max = Math.max(y_max, target);
+ y_min = Math.min(y_min, target);
+
+ while (st.hasMoreTokens())
+ {
+ index = Integer.parseInt(st.nextToken());
+ value = Double.parseDouble(st.nextToken());
+
+ for (i = next_index; i < index; i++)
+ {
+ feature_max[i] = Math.max(feature_max[i], 0);
+ feature_min[i] = Math.min(feature_min[i], 0);
+ }
+
+ feature_max[index] = Math.max(feature_max[index], value);
+ feature_min[index] = Math.min(feature_min[index], value);
+ next_index = index + 1;
+ }
+
+ for(i=next_index;i<=max_index;i++)
+ {
+ feature_max[i] = Math.max(feature_max[i], 0);
+ feature_min[i] = Math.min(feature_min[i], 0);
+ }
+ }
+
+ fp = rewind(fp, data_filename);
+
+ /* pass 2.5: save/restore feature_min/feature_max */
+ if(restore_filename != null)
+ {
+ // fp_restore was rewound while finding max_index
+ int idx, c;
+ double fmin, fmax;
+
+ fp_restore.mark(2); // for reset
+ if((c = fp_restore.read()) == 'y')
+ {
+ fp_restore.readLine(); // pass the '\n' after 'y'
+ StringTokenizer st = new StringTokenizer(fp_restore.readLine());
+ y_lower = Double.parseDouble(st.nextToken());
+ y_upper = Double.parseDouble(st.nextToken());
+ st = new StringTokenizer(fp_restore.readLine());
+ y_min = Double.parseDouble(st.nextToken());
+ y_max = Double.parseDouble(st.nextToken());
+ y_scaling = true;
+ }
+ else
+ fp_restore.reset();
+
+ if(fp_restore.read() == 'x')
+ {
+ fp_restore.readLine(); // pass the '\n' after 'x'
+ StringTokenizer st = new StringTokenizer(fp_restore.readLine());
+ lower = Double.parseDouble(st.nextToken());
+ upper = Double.parseDouble(st.nextToken());
+ String restore_line = null;
+ while((restore_line = fp_restore.readLine())!=null)
+ {
+ StringTokenizer st2 = new StringTokenizer(restore_line);
+ idx = Integer.parseInt(st2.nextToken());
+ fmin = Double.parseDouble(st2.nextToken());
+ fmax = Double.parseDouble(st2.nextToken());
+ if (idx <= max_index)
+ {
+ feature_min[idx] = fmin;
+ feature_max[idx] = fmax;
+ }
+ }
+ }
+ fp_restore.close();
+ }
+
+ if(save_filename != null)
+ {
+ Formatter formatter = new Formatter(new StringBuilder());
+ BufferedWriter fp_save = null;
+
+ try {
+ fp_save = new BufferedWriter(new FileWriter(save_filename));
+ } catch(IOException e) {
+ System.err.println("can't open file " + save_filename);
+ System.exit(1);
+ }
+
+ if(y_scaling)
+ {
+ formatter.format("y\n");
+ formatter.format("%.16g %.16g\n", y_lower, y_upper);
+ formatter.format("%.16g %.16g\n", y_min, y_max);
+ }
+ formatter.format("x\n");
+ formatter.format("%.16g %.16g\n", lower, upper);
+
+ for(i=1;i<=max_index;i++)
+ {
+ if(feature_min[i] != feature_max[i])
+ formatter.format("%d %.16g %.16g\n", i, feature_min[i], feature_max[i]);
+ }
+ fp_save.write(formatter.toString());
+ fp_save.close();
+ }
+
+ /* pass 3: scale */
+ while(readline(fp) != null)
+ {
+ int next_index = 1;
+ double target;
+ double value;
+
+ StringTokenizer st = new StringTokenizer(line," \t\n\r\f:");
+ target = Double.parseDouble(st.nextToken());
+ output_target(target);
+ while(st.hasMoreElements())
+ {
+ index = Integer.parseInt(st.nextToken());
+ value = Double.parseDouble(st.nextToken());
+
+ for (i = next_index; i < index; i++)
+ output(i, 0);
+
+ output(index, value);
+ next_index = index + 1;
+ }
+
+ for(i=next_index;i<=max_index;i++)
+ output(i, 0);
+
+ System.out.print("\n");
+ }
+
+ if (new_num_nonzeros > num_nonzeros)
+ System.err.print(
+ "WARNING: original #nonzeros " + num_nonzeros+"\n"
+ +" new #nonzeros " + new_num_nonzeros+"\n"
+ +"Use -l 0 if many original feature values are zeros\n");
+
+ fp.close();
+ }
+
+ public static void main(String argv[]) throws IOException
+ {
+ svm_scale s = new svm_scale();
+ s.run(argv);
+ }
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/java/svm_toy.java b/src/backend/app/algorithms/evaluate/libsvm/java/svm_toy.java
new file mode 100644
index 0000000..c4bd503
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/java/svm_toy.java
@@ -0,0 +1,502 @@
+import libsvm.*;
+import java.applet.*;
+import java.awt.*;
+import java.util.*;
+import java.awt.event.*;
+import java.io.*;
+
+public class svm_toy extends Applet {
+
+ static final String DEFAULT_PARAM="-t 2 -c 100";
+ int XLEN;
+ int YLEN;
+
+ // off-screen buffer
+
+ Image buffer;
+ Graphics buffer_gc;
+
+ // pre-allocated colors
+
+ final static Color colors[] =
+ {
+ new Color(0,0,0),
+ new Color(0,120,120),
+ new Color(120,120,0),
+ new Color(120,0,120),
+ new Color(0,200,200),
+ new Color(200,200,0),
+ new Color(200,0,200)
+ };
+
+ class point {
+ point(double x, double y, byte value)
+ {
+ this.x = x;
+ this.y = y;
+ this.value = value;
+ }
+ double x, y;
+ byte value;
+ }
+
+ Vector<point> point_list = new Vector<point>();
+ byte current_value = 1;
+
+ public void init()
+ {
+ setSize(getSize());
+
+ final Button button_change = new Button("Change");
+ Button button_run = new Button("Run");
+ Button button_clear = new Button("Clear");
+ Button button_save = new Button("Save");
+ Button button_load = new Button("Load");
+ final TextField input_line = new TextField(DEFAULT_PARAM);
+
+ BorderLayout layout = new BorderLayout();
+ this.setLayout(layout);
+
+ Panel p = new Panel();
+ GridBagLayout gridbag = new GridBagLayout();
+ p.setLayout(gridbag);
+
+ GridBagConstraints c = new GridBagConstraints();
+ c.fill = GridBagConstraints.HORIZONTAL;
+ c.weightx = 1;
+ c.gridwidth = 1;
+ gridbag.setConstraints(button_change,c);
+ gridbag.setConstraints(button_run,c);
+ gridbag.setConstraints(button_clear,c);
+ gridbag.setConstraints(button_save,c);
+ gridbag.setConstraints(button_load,c);
+ c.weightx = 5;
+ c.gridwidth = 5;
+ gridbag.setConstraints(input_line,c);
+
+ button_change.setBackground(colors[current_value]);
+
+ p.add(button_change);
+ p.add(button_run);
+ p.add(button_clear);
+ p.add(button_save);
+ p.add(button_load);
+ p.add(input_line);
+ this.add(p,BorderLayout.SOUTH);
+
+ button_change.addActionListener(new ActionListener()
+ { public void actionPerformed (ActionEvent e)
+ { button_change_clicked(); button_change.setBackground(colors[current_value]); }});
+
+ button_run.addActionListener(new ActionListener()
+ { public void actionPerformed (ActionEvent e)
+ { button_run_clicked(input_line.getText()); }});
+
+ button_clear.addActionListener(new ActionListener()
+ { public void actionPerformed (ActionEvent e)
+ { button_clear_clicked(); }});
+
+ button_save.addActionListener(new ActionListener()
+ { public void actionPerformed (ActionEvent e)
+ { button_save_clicked(input_line.getText()); }});
+
+ button_load.addActionListener(new ActionListener()
+ { public void actionPerformed (ActionEvent e)
+ { button_load_clicked(); }});
+
+ input_line.addActionListener(new ActionListener()
+ { public void actionPerformed (ActionEvent e)
+ { button_run_clicked(input_line.getText()); }});
+
+ this.enableEvents(AWTEvent.MOUSE_EVENT_MASK);
+ }
+
+ void draw_point(point p)
+ {
+ Color c = colors[p.value+3];
+
+ Graphics window_gc = getGraphics();
+ buffer_gc.setColor(c);
+ buffer_gc.fillRect((int)(p.x*XLEN),(int)(p.y*YLEN),4,4);
+ window_gc.setColor(c);
+ window_gc.fillRect((int)(p.x*XLEN),(int)(p.y*YLEN),4,4);
+ }
+
+ void clear_all()
+ {
+ point_list.removeAllElements();
+ if(buffer != null)
+ {
+ buffer_gc.setColor(colors[0]);
+ buffer_gc.fillRect(0,0,XLEN,YLEN);
+ }
+ repaint();
+ }
+
+ void draw_all_points()
+ {
+ int n = point_list.size();
+ for(int i=0;i<n;i++)
+ draw_point((point)point_list.elementAt(i));
+ }
+
+ void button_change_clicked()
+ {
+ ++current_value;
+ if(current_value > 3) current_value = 1;
+ }
+
+ private static double atof(String s)
+ {
+ return Double.valueOf(s).doubleValue();
+ }
+
+ private static int atoi(String s)
+ {
+ return Integer.parseInt(s);
+ }
+
+ void button_run_clicked(String args)
+ {
+ // guard
+ if(point_list.isEmpty()) return;
+
+ svm_parameter param = new svm_parameter();
+
+ // default values
+ param.svm_type = svm_parameter.C_SVC;
+ param.kernel_type = svm_parameter.RBF;
+ param.degree = 3;
+ param.gamma = 0;
+ param.coef0 = 0;
+ param.nu = 0.5;
+ param.cache_size = 40;
+ param.C = 1;
+ param.eps = 1e-3;
+ param.p = 0.1;
+ param.shrinking = 1;
+ param.probability = 0;
+ param.nr_weight = 0;
+ param.weight_label = new int[0];
+ param.weight = new double[0];
+
+ // parse options
+ StringTokenizer st = new StringTokenizer(args);
+ String[] argv = new String[st.countTokens()];
+ for(int i=0;i<argv.length;i++)
+ argv[i] = st.nextToken();
+
+ for(int i=0;i<argv.length;i++)
+ {
+ if(argv[i].charAt(0) != '-') break;
+ if(++i>=argv.length)
+ {
+ System.err.print("unknown option\n");
+ break;
+ }
+ switch(argv[i-1].charAt(1))
+ {
+ case 's':
+ param.svm_type = atoi(argv[i]);
+ break;
+ case 't':
+ param.kernel_type = atoi(argv[i]);
+ break;
+ case 'd':
+ param.degree = atoi(argv[i]);
+ break;
+ case 'g':
+ param.gamma = atof(argv[i]);
+ break;
+ case 'r':
+ param.coef0 = atof(argv[i]);
+ break;
+ case 'n':
+ param.nu = atof(argv[i]);
+ break;
+ case 'm':
+ param.cache_size = atof(argv[i]);
+ break;
+ case 'c':
+ param.C = atof(argv[i]);
+ break;
+ case 'e':
+ param.eps = atof(argv[i]);
+ break;
+ case 'p':
+ param.p = atof(argv[i]);
+ break;
+ case 'h':
+ param.shrinking = atoi(argv[i]);
+ break;
+ case 'b':
+ param.probability = atoi(argv[i]);
+ break;
+ case 'w':
+ ++param.nr_weight;
+ {
+ int[] old = param.weight_label;
+ param.weight_label = new int[param.nr_weight];
+ System.arraycopy(old,0,param.weight_label,0,param.nr_weight-1);
+ }
+
+ {
+ double[] old = param.weight;
+ param.weight = new double[param.nr_weight];
+ System.arraycopy(old,0,param.weight,0,param.nr_weight-1);
+ }
+
+ param.weight_label[param.nr_weight-1] = atoi(argv[i-1].substring(2));
+ param.weight[param.nr_weight-1] = atof(argv[i]);
+ break;
+ default:
+ System.err.print("unknown option\n");
+ }
+ }
+
+ // build problem
+ svm_problem prob = new svm_problem();
+ prob.l = point_list.size();
+ prob.y = new double[prob.l];
+
+ if(param.kernel_type == svm_parameter.PRECOMPUTED)
+ {
+ }
+ else if(param.svm_type == svm_parameter.EPSILON_SVR ||
+ param.svm_type == svm_parameter.NU_SVR)
+ {
+ if(param.gamma == 0) param.gamma = 1;
+ prob.x = new svm_node[prob.l][1];
+ for(int i=0;i<prob.l;i++)
+ {
+ point p = (point)point_list.elementAt(i);
+ prob.x[i][0] = new svm_node();
+ prob.x[i][0].index = 1;
+ prob.x[i][0].value = p.x;
+ prob.y[i] = p.y;
+ }
+
+ // build model & draw the regression curve
+ svm_model model = svm.svm_train(prob, param);
+ svm_node[] x = new svm_node[1];
+ x[0] = new svm_node();
+ x[0].index = 1;
+ int[] j = new int[XLEN];
+
+ Graphics window_gc = getGraphics();
+ for (int i = 0; i < XLEN; i++)
+ {
+ x[0].value = (double) i / XLEN;
+ j[i] = (int)(YLEN*svm.svm_predict(model, x));
+ }
+
+ buffer_gc.setColor(colors[0]);
+ buffer_gc.drawLine(0,0,0,YLEN-1);
+ window_gc.setColor(colors[0]);
+ window_gc.drawLine(0,0,0,YLEN-1);
+
+ int p = (int)(param.p * YLEN);
+ for(int i=1;i<XLEN;i++)
+ {
+ buffer_gc.setColor(colors[0]);
+ buffer_gc.drawLine(i,0,i,YLEN-1);
+ window_gc.setColor(colors[0]);
+ window_gc.drawLine(i,0,i,YLEN-1);
+
+ buffer_gc.setColor(colors[5]);
+ window_gc.setColor(colors[5]);
+ buffer_gc.drawLine(i-1,j[i-1],i,j[i]);
+ window_gc.drawLine(i-1,j[i-1],i,j[i]);
+
+ if(param.svm_type == svm_parameter.EPSILON_SVR)
+ {
+ buffer_gc.setColor(colors[2]);
+ window_gc.setColor(colors[2]);
+ buffer_gc.drawLine(i-1,j[i-1]+p,i,j[i]+p);
+ window_gc.drawLine(i-1,j[i-1]+p,i,j[i]+p);
+ buffer_gc.drawLine(i-1,j[i-1]-p,i,j[i]-p);
+ window_gc.drawLine(i-1,j[i-1]-p,i,j[i]-p);
+ }
+ }
+ }
+ else
+ {
+ if(param.gamma == 0) param.gamma = 0.5;
+ prob.x = new svm_node[prob.l][2];
+ for(int i=0;i<prob.l;i++)
+ {
+ point p = (point)point_list.elementAt(i);
+ prob.x[i][0] = new svm_node();
+ prob.x[i][0].index = 1;
+ prob.x[i][0].value = p.x;
+ prob.x[i][1] = new svm_node();
+ prob.x[i][1].index = 2;
+ prob.x[i][1].value = p.y;
+ prob.y[i] = p.value;
+ }
+
+ // build model & classify every pixel
+ svm_model model = svm.svm_train(prob, param);
+ svm_node[] x = new svm_node[2];
+ x[0] = new svm_node();
+ x[1] = new svm_node();
+ x[0].index = 1;
+ x[1].index = 2;
+
+ Graphics window_gc = getGraphics();
+ for (int i = 0; i < XLEN; i++)
+ for (int j = 0; j < YLEN; j++)
+ {
+ x[0].value = (double) i / XLEN;
+ x[1].value = (double) j / YLEN;
+ double d = svm.svm_predict(model, x);
+ if (param.svm_type == svm_parameter.ONE_CLASS && d<0) d=2;
+ buffer_gc.setColor(colors[(int)d]);
+ window_gc.setColor(colors[(int)d]);
+ buffer_gc.drawLine(i,j,i,j);
+ window_gc.drawLine(i,j,i,j);
+ }
+ }
+
+ draw_all_points();
+ }
+
+ void button_clear_clicked()
+ {
+ clear_all();
+ }
+
+ void button_save_clicked(String args)
+ {
+ FileDialog dialog = new FileDialog(new Frame(),"Save",FileDialog.SAVE);
+ dialog.setVisible(true);
+ String filename = dialog.getDirectory() + dialog.getFile();
+ if (filename == null) return;
+ try {
+ DataOutputStream fp = new DataOutputStream(
+ new BufferedOutputStream(new FileOutputStream(filename)));
+
+ int svm_type = svm_parameter.C_SVC;
+ int svm_type_idx = args.indexOf("-s ");
+ if(svm_type_idx != -1)
+ {
+ StringTokenizer svm_str_st = new StringTokenizer(args.substring(svm_type_idx+2).trim());
+ svm_type = atoi(svm_str_st.nextToken());
+ }
+
+ int n = point_list.size();
+ if(svm_type == svm_parameter.EPSILON_SVR || svm_type == svm_parameter.NU_SVR)
+ {
+ for(int i=0;i<n;i++)
+ {
+ point p = (point)point_list.elementAt(i);
+ fp.writeBytes(p.y+" 1:"+p.x+"\n");
+ }
+ }
+ else
+ {
+ for(int i=0;i<n;i++)
+ {
+ point p = (point)point_list.elementAt(i);
+ fp.writeBytes(p.value+" 1:"+p.x+" 2:"+p.y+"\n");
+ }
+ }
+ fp.close();
+ } catch (IOException e) { System.err.print(e); }
+ }
+
+ void button_load_clicked()
+ {
+ FileDialog dialog = new FileDialog(new Frame(),"Load",FileDialog.LOAD);
+ dialog.setVisible(true);
+ String filename = dialog.getDirectory() + dialog.getFile();
+ if (filename == null) return;
+ clear_all();
+ try {
+ BufferedReader fp = new BufferedReader(new FileReader(filename));
+ String line;
+ while((line = fp.readLine()) != null)
+ {
+ StringTokenizer st = new StringTokenizer(line," \t\n\r\f:");
+ if(st.countTokens() == 5) // classification data: label 1:x 2:y
+ {
+ byte value = (byte)atoi(st.nextToken());
+ st.nextToken();
+ double x = atof(st.nextToken());
+ st.nextToken();
+ double y = atof(st.nextToken());
+ point_list.addElement(new point(x,y,value));
+ }
+ else if(st.countTokens() == 3) // regression data: y 1:x
+ {
+ double y = atof(st.nextToken());
+ st.nextToken();
+ double x = atof(st.nextToken());
+ point_list.addElement(new point(x,y,current_value));
+ }
+ else
+ break;
+ }
+ fp.close();
+ } catch (IOException e) { System.err.print(e); }
+ draw_all_points();
+ }
+
+ protected void processMouseEvent(MouseEvent e)
+ {
+ if(e.getID() == MouseEvent.MOUSE_PRESSED)
+ {
+ if(e.getX() >= XLEN || e.getY() >= YLEN) return;
+ point p = new point((double)e.getX()/XLEN,
+ (double)e.getY()/YLEN,
+ current_value);
+ point_list.addElement(p);
+ draw_point(p);
+ }
+ }
+
+ public void paint(Graphics g)
+ {
+ // create buffer first time
+ if(buffer == null) {
+ buffer = this.createImage(XLEN,YLEN);
+ buffer_gc = buffer.getGraphics();
+ buffer_gc.setColor(colors[0]);
+ buffer_gc.fillRect(0,0,XLEN,YLEN);
+ }
+ g.drawImage(buffer,0,0,this);
+ }
+
+ public Dimension getPreferredSize() { return new Dimension(XLEN,YLEN+50); }
+
+ public void setSize(Dimension d) { setSize(d.width,d.height); }
+ public void setSize(int w,int h) {
+ super.setSize(w,h);
+ XLEN = w;
+ YLEN = h-50;
+ clear_all();
+ }
+
+ public static void main(String[] argv)
+ {
+ new AppletFrame("svm_toy",new svm_toy(),500,500+50);
+ }
+}
+
+class AppletFrame extends Frame {
+ AppletFrame(String title, Applet applet, int width, int height)
+ {
+ super(title);
+ this.addWindowListener(new WindowAdapter() {
+ public void windowClosing(WindowEvent e) {
+ System.exit(0);
+ }
+ });
+ applet.init();
+ applet.setSize(width,height);
+ applet.start();
+ this.add(applet);
+ this.pack();
+ this.setVisible(true);
+ }
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/java/svm_train.java b/src/backend/app/algorithms/evaluate/libsvm/java/svm_train.java
new file mode 100644
index 0000000..22ee043
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/java/svm_train.java
@@ -0,0 +1,318 @@
+import libsvm.*;
+import java.io.*;
+import java.util.*;
+
+class svm_train {
+ private svm_parameter param; // set by parse_command_line
+ private svm_problem prob; // set by read_problem
+ private svm_model model;
+ private String input_file_name; // set by parse_command_line
+ private String model_file_name; // set by parse_command_line
+ private String error_msg;
+ private int cross_validation;
+ private int nr_fold;
+
+ private static svm_print_interface svm_print_null = new svm_print_interface()
+ {
+ public void print(String s) {}
+ };
+
+ private static void exit_with_help()
+ {
+ System.out.print(
+ "Usage: svm_train [options] training_set_file [model_file]\n"
+ +"options:\n"
+ +"-s svm_type : set type of SVM (default 0)\n"
+ +" 0 -- C-SVC (multi-class classification)\n"
+ +" 1 -- nu-SVC (multi-class classification)\n"
+ +" 2 -- one-class SVM\n"
+ +" 3 -- epsilon-SVR (regression)\n"
+ +" 4 -- nu-SVR (regression)\n"
+ +"-t kernel_type : set type of kernel function (default 2)\n"
+ +" 0 -- linear: u'*v\n"
+ +" 1 -- polynomial: (gamma*u'*v + coef0)^degree\n"
+ +" 2 -- radial basis function: exp(-gamma*|u-v|^2)\n"
+ +" 3 -- sigmoid: tanh(gamma*u'*v + coef0)\n"
+ +" 4 -- precomputed kernel (kernel values in training_set_file)\n"
+ +"-d degree : set degree in kernel function (default 3)\n"
+ +"-g gamma : set gamma in kernel function (default 1/num_features)\n"
+ +"-r coef0 : set coef0 in kernel function (default 0)\n"
+ +"-c cost : set the parameter C of C-SVC, epsilon-SVR, and nu-SVR (default 1)\n"
+ +"-n nu : set the parameter nu of nu-SVC, one-class SVM, and nu-SVR (default 0.5)\n"
+ +"-p epsilon : set the epsilon in loss function of epsilon-SVR (default 0.1)\n"
+ +"-m cachesize : set cache memory size in MB (default 100)\n"
+ +"-e epsilon : set tolerance of termination criterion (default 0.001)\n"
+ +"-h shrinking : whether to use the shrinking heuristics, 0 or 1 (default 1)\n"
+ +"-b probability_estimates : whether to train a SVC or SVR model for probability estimates, 0 or 1 (default 0)\n"
+ +"-wi weight : set the parameter C of class i to weight*C, for C-SVC (default 1)\n"
+ +"-v n : n-fold cross validation mode\n"
+ +"-q : quiet mode (no outputs)\n"
+ );
+ System.exit(1);
+ }
+
+ private void do_cross_validation()
+ {
+ int i;
+ int total_correct = 0;
+ double total_error = 0;
+ double sumv = 0, sumy = 0, sumvv = 0, sumyy = 0, sumvy = 0;
+ double[] target = new double[prob.l];
+
+ svm.svm_cross_validation(prob,param,nr_fold,target);
+ if(param.svm_type == svm_parameter.EPSILON_SVR ||
+ param.svm_type == svm_parameter.NU_SVR)
+ {
+ for(i=0;i<prob.l;i++)
+ {
+ double y = prob.y[i];
+ double v = target[i];
+ total_error += (v-y)*(v-y);
+ sumv += v;
+ sumy += y;
+ sumvv += v*v;
+ sumyy += y*y;
+ sumvy += v*y;
+ }
+ System.out.print("Cross Validation Mean squared error = "+total_error/prob.l+"\n");
+ System.out.print("Cross Validation Squared correlation coefficient = "+
+ ((prob.l*sumvy-sumv*sumy)*(prob.l*sumvy-sumv*sumy))/
+ ((prob.l*sumvv-sumv*sumv)*(prob.l*sumyy-sumy*sumy))+"\n"
+ );
+ }
+ else
+ {
+ for(i=0;i<prob.l;i++)
+ if(target[i] == prob.y[i])
+ ++total_correct;
+ System.out.print("Cross Validation Accuracy = "+100.0*total_correct/prob.l+"%\n");
+ }
+ }
+
+ private void run(String argv[]) throws IOException
+ {
+ parse_command_line(argv);
+ read_problem();
+ error_msg = svm.svm_check_parameter(prob,param);
+
+ if(error_msg != null)
+ {
+ System.err.print("ERROR: "+error_msg+"\n");
+ System.exit(1);
+ }
+
+ if(cross_validation != 0)
+ {
+ do_cross_validation();
+ }
+ else
+ {
+ model = svm.svm_train(prob,param);
+ svm.svm_save_model(model_file_name,model);
+ }
+ }
+
+ public static void main(String argv[]) throws IOException
+ {
+ svm_train t = new svm_train();
+ t.run(argv);
+ }
+
+ private static double atof(String s)
+ {
+ double d = Double.valueOf(s).doubleValue();
+ if (Double.isNaN(d) || Double.isInfinite(d))
+ {
+ System.err.print("NaN or Infinity in input\n");
+ System.exit(1);
+ }
+ return(d);
+ }
+
+ private static int atoi(String s)
+ {
+ return Integer.parseInt(s);
+ }
+
+ private void parse_command_line(String argv[])
+ {
+ int i;
+ svm_print_interface print_func = null; // default printing to stdout
+
+ param = new svm_parameter();
+ // default values
+ param.svm_type = svm_parameter.C_SVC;
+ param.kernel_type = svm_parameter.RBF;
+ param.degree = 3;
+ param.gamma = 0; // 1/num_features
+ param.coef0 = 0;
+ param.nu = 0.5;
+ param.cache_size = 100;
+ param.C = 1;
+ param.eps = 1e-3;
+ param.p = 0.1;
+ param.shrinking = 1;
+ param.probability = 0;
+ param.nr_weight = 0;
+ param.weight_label = new int[0];
+ param.weight = new double[0];
+ cross_validation = 0;
+
+ // parse options
+ for(i=0;i<argv.length;i++)
+ {
+ if(argv[i].charAt(0) != '-') break;
+ if(++i>=argv.length)
+ exit_with_help();
+ switch(argv[i-1].charAt(1))
+ {
+ case 's':
+ param.svm_type = atoi(argv[i]);
+ break;
+ case 't':
+ param.kernel_type = atoi(argv[i]);
+ break;
+ case 'd':
+ param.degree = atoi(argv[i]);
+ break;
+ case 'g':
+ param.gamma = atof(argv[i]);
+ break;
+ case 'r':
+ param.coef0 = atof(argv[i]);
+ break;
+ case 'n':
+ param.nu = atof(argv[i]);
+ break;
+ case 'm':
+ param.cache_size = atof(argv[i]);
+ break;
+ case 'c':
+ param.C = atof(argv[i]);
+ break;
+ case 'e':
+ param.eps = atof(argv[i]);
+ break;
+ case 'p':
+ param.p = atof(argv[i]);
+ break;
+ case 'h':
+ param.shrinking = atoi(argv[i]);
+ break;
+ case 'b':
+ param.probability = atoi(argv[i]);
+ break;
+ case 'q':
+ print_func = svm_print_null;
+ i--;
+ break;
+ case 'v':
+ cross_validation = 1;
+ nr_fold = atoi(argv[i]);
+ if(nr_fold < 2)
+ {
+ System.err.print("n-fold cross validation: n must >= 2\n");
+ exit_with_help();
+ }
+ break;
+ case 'w':
+ ++param.nr_weight;
+ {
+ int[] old = param.weight_label;
+ param.weight_label = new int[param.nr_weight];
+ System.arraycopy(old,0,param.weight_label,0,param.nr_weight-1);
+ }
+
+ {
+ double[] old = param.weight;
+ param.weight = new double[param.nr_weight];
+ System.arraycopy(old,0,param.weight,0,param.nr_weight-1);
+ }
+
+ param.weight_label[param.nr_weight-1] = atoi(argv[i-1].substring(2));
+ param.weight[param.nr_weight-1] = atof(argv[i]);
+ break;
+ default:
+ System.err.print("Unknown option: " + argv[i-1] + "\n");
+ exit_with_help();
+ }
+ }
+
+ svm.svm_set_print_string_function(print_func);
+
+ // determine filenames
+
+ if(i>=argv.length)
+ exit_with_help();
+
+ input_file_name = argv[i];
+
+ if(i<argv.length-1)
+ model_file_name = argv[i+1];
+ else
+ {
+ int p = argv[i].lastIndexOf('/');
+ ++p; // skip the last '/'
+ model_file_name = argv[i].substring(p)+".model";
+ }
+ }
+
+ // read in a problem (in svmlight format)
+
+ private void read_problem() throws IOException
+ {
+ BufferedReader fp = new BufferedReader(new FileReader(input_file_name));
+ Vector<Double> vy = new Vector<Double>();
+ Vector<svm_node[]> vx = new Vector<svm_node[]>();
+ int max_index = 0;
+
+ while(true)
+ {
+ String line = fp.readLine();
+ if(line == null) break;
+
+ StringTokenizer st = new StringTokenizer(line," \t\n\r\f:");
+
+ vy.addElement(atof(st.nextToken()));
+ int m = st.countTokens()/2;
+ svm_node[] x = new svm_node[m];
+ for(int j=0;j<m;j++)
+ {
+ x[j] = new svm_node();
+ x[j].index = atoi(st.nextToken());
+ x[j].value = atof(st.nextToken());
+ }
+ if(m>0) max_index = Math.max(max_index, x[m-1].index);
+ vx.addElement(x);
+ }
+
+ prob = new svm_problem();
+ prob.l = vy.size();
+ prob.x = new svm_node[prob.l][];
+ for(int i=0;i<prob.l;i++)
+ prob.x[i] = vx.elementAt(i);
+ prob.y = new double[prob.l];
+ for(int i=0;i<prob.l;i++)
+ prob.y[i] = vy.elementAt(i);
+
+ if(param.gamma == 0 && max_index > 0)
+ param.gamma = 1.0/max_index;
+
+ if(param.kernel_type == svm_parameter.PRECOMPUTED)
+ for(int i=0;i<prob.l;i++)
+ {
+ if (prob.x[i][0].index != 0)
+ {
+ System.err.print("Wrong kernel matrix: first column must be 0:sample_serial_number\n");
+ System.exit(1);
+ }
+ if ((int)prob.x[i][0].value <= 0 || (int)prob.x[i][0].value > max_index)
+ {
+ System.err.print("Wrong input format: sample_serial_number out of range\n");
+ System.exit(1);
+ }
+ }
+
+ fp.close();
+ }
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/java/test_applet.html b/src/backend/app/algorithms/evaluate/libsvm/java/test_applet.html
new file mode 100644
index 0000000..7f40424
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/java/test_applet.html
@@ -0,0 +1 @@
+<applet code="svm_toy.class" width=300 height=350></applet>
diff --git a/src/backend/app/algorithms/evaluate/libsvm/libsvm.so.2 b/src/backend/app/algorithms/evaluate/libsvm/libsvm.so.2
new file mode 100644
index 0000000..9457e76
Binary files /dev/null and b/src/backend/app/algorithms/evaluate/libsvm/libsvm.so.2 differ
diff --git a/src/backend/app/algorithms/evaluate/libsvm/matlab/Makefile b/src/backend/app/algorithms/evaluate/libsvm/matlab/Makefile
new file mode 100644
index 0000000..6693494
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/matlab/Makefile
@@ -0,0 +1,45 @@
+# This Makefile is used under Linux
+
+MATLABDIR ?= /usr/local/matlab
+# for Mac
+# MATLABDIR ?= /opt/local/matlab
+
+CXX ?= g++
+#CXX = g++-4.1
+CFLAGS = -Wall -Wconversion -O3 -fPIC -I$(MATLABDIR)/extern/include -I..
+
+MEX = $(MATLABDIR)/bin/mex
+MEX_OPTION = CC="$(CXX)" CXX="$(CXX)" CFLAGS="$(CFLAGS)" CXXFLAGS="$(CFLAGS)"
+# comment the following line if you use MATLAB on 32-bit computer
+MEX_OPTION += -largeArrayDims
+MEX_EXT = $(shell $(MATLABDIR)/bin/mexext)
+
+all: matlab
+
+matlab: binary
+
+octave:
+ @echo "please type make under Octave"
+
+binary: svmpredict.$(MEX_EXT) svmtrain.$(MEX_EXT) libsvmread.$(MEX_EXT) libsvmwrite.$(MEX_EXT)
+
+svmpredict.$(MEX_EXT): svmpredict.c ../svm.h ../svm.o svm_model_matlab.o
+ $(MEX) $(MEX_OPTION) svmpredict.c ../svm.o svm_model_matlab.o
+
+svmtrain.$(MEX_EXT): svmtrain.c ../svm.h ../svm.o svm_model_matlab.o
+ $(MEX) $(MEX_OPTION) svmtrain.c ../svm.o svm_model_matlab.o
+
+libsvmread.$(MEX_EXT): libsvmread.c
+ $(MEX) $(MEX_OPTION) libsvmread.c
+
+libsvmwrite.$(MEX_EXT): libsvmwrite.c
+ $(MEX) $(MEX_OPTION) libsvmwrite.c
+
+svm_model_matlab.o: svm_model_matlab.c ../svm.h
+ $(CXX) $(CFLAGS) -c svm_model_matlab.c
+
+../svm.o: ../svm.cpp ../svm.h
+ make -C .. svm.o
+
+clean:
+ rm -f *~ *.o *.mex* *.obj ../svm.o
diff --git a/src/backend/app/algorithms/evaluate/libsvm/matlab/README b/src/backend/app/algorithms/evaluate/libsvm/matlab/README
new file mode 100644
index 0000000..ce1bcf8
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/matlab/README
@@ -0,0 +1,245 @@
+-----------------------------------------
+--- MATLAB/OCTAVE interface of LIBSVM ---
+-----------------------------------------
+
+Table of Contents
+=================
+
+- Introduction
+- Installation
+- Usage
+- Returned Model Structure
+- Other Utilities
+- Examples
+- Additional Information
+
+
+Introduction
+============
+
+This tool provides a simple interface to LIBSVM, a library for support vector
+machines (http://www.csie.ntu.edu.tw/~cjlin/libsvm). It is easy to use, as
+the usage and the way of specifying parameters are the same as those of LIBSVM.
+
+Installation
+============
+
+On Windows systems, pre-built binary files are already in the
+directory '..\windows', so no installation is needed. We currently
+provide binary files only for 64-bit MATLAB on Windows. If you would
+like to re-build the package, follow the steps below.
+
+We recommend using make.m on both MATLAB and OCTAVE. Just type 'make'
+to build 'libsvmread.mex', 'libsvmwrite.mex', 'svmtrain.mex', and
+'svmpredict.mex'.
+
+On MATLAB or Octave:
+
+ >> make
+
+If make.m does not work on MATLAB (especially for Windows), try 'mex
+-setup' to choose a suitable compiler for mex. Make sure your compiler
+is accessible and workable. Then type 'make' to start the
+installation.
+
+Example:
+
+ matlab>> mex -setup
+ (ps: MATLAB will show the following messages to setup default compiler.)
+ Please choose your compiler for building external interface (MEX) files:
+ Would you like mex to locate installed compilers [y]/n? y
+ Select a compiler:
+ [1] Microsoft Visual C/C++ version 7.1 in C:\Program Files\Microsoft Visual Studio
+ [0] None
+ Compiler: 1
+ Please verify your choices:
+ Compiler: Microsoft Visual C/C++ 7.1
+ Location: C:\Program Files\Microsoft Visual Studio
+ Are these correct?([y]/n): y
+
+ matlab>> make
+
+On Unix systems, if neither make.m nor 'mex -setup' works, please use
+Makefile and type 'make' in a command window. Note that we assume
+your MATLAB is installed in '/usr/local/matlab'. If not, please change
+MATLABDIR in Makefile.
+
+Example:
+ linux> make
+
+To use octave, type 'make octave':
+
+Example:
+ linux> make octave
+
+For a list of supported/compatible compilers for MATLAB, please check
+the following page:
+
+http://www.mathworks.com/support/compilers/current_release/
+
+Usage
+=====
+
+matlab> model = svmtrain(training_label_vector, training_instance_matrix [, 'libsvm_options']);
+
+ -training_label_vector:
+ An m by 1 vector of training labels (type must be double).
+ -training_instance_matrix:
+ An m by n matrix of m training instances with n features.
+ It can be dense or sparse (type must be double).
+ -libsvm_options:
+ A string of training options in the same format as that of LIBSVM.
+
+matlab> [predicted_label, accuracy, decision_values/prob_estimates] = svmpredict(testing_label_vector, testing_instance_matrix, model [, 'libsvm_options']);
+matlab> [predicted_label] = svmpredict(testing_label_vector, testing_instance_matrix, model [, 'libsvm_options']);
+
+ -testing_label_vector:
+ An m by 1 vector of prediction labels. If labels of test
+ data are unknown, simply use any random values. (type must be double)
+ -testing_instance_matrix:
+ An m by n matrix of m testing instances with n features.
+ It can be dense or sparse. (type must be double)
+ -model:
+ The output of svmtrain.
+ -libsvm_options:
+ A string of testing options in the same format as that of LIBSVM.
+
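+As a quick illustration of the calls above, here is a minimal sketch;
+y and X are assumed to be an m by 1 label vector and an m by n
+feature matrix already in the workspace:
+
+matlab> model = svmtrain(y, sparse(X), '-c 1 -g 0.5');
+matlab> [predict_label, accuracy, dec_values] = svmpredict(y, sparse(X), model);
+
+Passing sparse(X) is optional for dense data, but it saves memory when
+most feature values are zero.
+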
+Returned Model Structure
+========================
+
+The 'svmtrain' function returns a model which can be used for future
+prediction. It is a structure and is organized as [Parameters, nr_class,
+totalSV, rho, Label, sv_indices, ProbA, ProbB, nSV, sv_coef, SVs]:
+
+ -Parameters: parameters
+ -nr_class: number of classes; = 2 for regression/one-class svm
+ -totalSV: total #SV
+ -rho: -b of the decision function(s) wx+b
+ -Label: label of each class; empty for regression/one-class SVM
+ -sv_indices: values in [1,...,num_training_data] to indicate SVs in the training set
+ -ProbA: pairwise probability information; empty if -b 0 or in one-class SVM
+ -ProbB: pairwise probability information; empty if -b 0 or in one-class SVM
+ -nSV: number of SVs for each class; empty for regression/one-class SVM
+ -sv_coef: coefficients for SVs in decision functions
+ -SVs: support vectors
+
+If you do not use the option '-b 1', ProbA and ProbB are empty
+matrices. If the '-v' option is specified, cross validation is
+conducted and the returned model is just a scalar: cross-validation
+accuracy for classification and mean-squared error for regression.
+
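+For example, a minimal sketch of 5-fold cross validation (heart_scale
+data as in the Examples section below):
+
+matlab> cv_accuracy = svmtrain(heart_scale_label, heart_scale_inst, '-c 1 -g 0.07 -v 5');
+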
+More details about this model can be found in LIBSVM FAQ
+(http://www.csie.ntu.edu.tw/~cjlin/libsvm/faq.html) and LIBSVM
+implementation document
+(http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.pdf).
+
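+As one common use of these fields, for a two-class model trained with
+the linear kernel, the weight vector w and bias b of the decision
+function w'*x+b can be recovered from SVs, sv_coef, and rho; a minimal
+sketch (heart_scale data as in the Examples section below):
+
+matlab> model = svmtrain(heart_scale_label, heart_scale_inst, '-t 0');
+matlab> w = model.SVs' * model.sv_coef;
+matlab> b = -model.rho;
+matlab> if model.Label(1) == -1, w = -w; b = -b; end
+
+The last line aligns the sign so that w'*x+b > 0 predicts the label +1.
+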
+Result of Prediction
+====================
+
+The function 'svmpredict' has three outputs. The first one,
+predicted_label, is a vector of predicted labels. The second output,
+accuracy, is a vector including accuracy (for classification), mean
+squared error, and squared correlation coefficient (for regression).
+The third is a matrix containing decision values or probability
+estimates (if '-b 1' is specified). If k is the number of classes
+in training data, for decision values, each row includes results of
+predicting k(k-1)/2 binary-class SVMs. For classification, k = 1 is a
+special case. Decision value +1 is returned for each testing instance,
+instead of an empty vector. For probabilities, each row contains k values
+indicating the probability that the testing instance is in each class.
+Note that the order of classes here is the same as 'Label' field
+in the model structure.
+
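+For example, after the probability run shown in the Examples section
+below, column j of prob_estimates corresponds to class model.Label(j),
+so the class order can be checked with:
+
+matlab> model.Label
+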
+Other Utilities
+===============
+
+A matlab function libsvmread reads files in LIBSVM format:
+
+[label_vector, instance_matrix] = libsvmread('data.txt');
+
+Two outputs are labels and instances, which can then be used as inputs
+of svmtrain or svmpredict.
+
+A matlab function libsvmwrite writes Matlab matrix to a file in LIBSVM format:
+
+libsvmwrite('data.txt', label_vector, instance_matrix)
+
+The instance_matrix must be a sparse matrix. (type must be double)
+Pre-built binary files for MATLAB on Windows are ready in the
+directory '..\windows'; as noted in the Installation section, recent
+releases include 64-bit MATLAB binaries only.
+
+This code was prepared by Rong-En Fan and Kai-Wei Chang from National
+Taiwan University.
+
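+A minimal round trip through both utilities; 'data_copy.txt' is a
+hypothetical output file name:
+
+matlab> [label_vector, instance_matrix] = libsvmread('../heart_scale');
+matlab> libsvmwrite('data_copy.txt', label_vector, instance_matrix);
+
+No sparse() conversion is needed here because libsvmread already
+returns a sparse instance_matrix.
+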
+Examples
+========
+
+Train and test on the provided data heart_scale:
+
+matlab> [heart_scale_label, heart_scale_inst] = libsvmread('../heart_scale');
+matlab> model = svmtrain(heart_scale_label, heart_scale_inst, '-c 1 -g 0.07');
+matlab> [predict_label, accuracy, dec_values] = svmpredict(heart_scale_label, heart_scale_inst, model); % test the training data
+
+For probability estimates, you need '-b 1' for training and testing:
+
+matlab> [heart_scale_label, heart_scale_inst] = libsvmread('../heart_scale');
+matlab> model = svmtrain(heart_scale_label, heart_scale_inst, '-c 1 -g 0.07 -b 1');
+matlab> [heart_scale_label, heart_scale_inst] = libsvmread('../heart_scale');
+matlab> [predict_label, accuracy, prob_estimates] = svmpredict(heart_scale_label, heart_scale_inst, model, '-b 1');
+
+To use precomputed kernel, you must include sample serial number as
+the first column of the training and testing data (assume your kernel
+matrix is K, # of instances is n):
+
+matlab> K1 = [(1:n)', K]; % include sample serial number as first column
+matlab> model = svmtrain(label_vector, K1, '-t 4');
+matlab> [predict_label, accuracy, dec_values] = svmpredict(label_vector, K1, model); % test the training data
+
+We give the following detailed example by splitting heart_scale into
+150 training and 120 testing data. Constructing a linear kernel
+matrix and then using the precomputed kernel gives exactly the same
+testing error as using the LIBSVM built-in linear kernel.
+
+matlab> [heart_scale_label, heart_scale_inst] = libsvmread('../heart_scale');
+matlab>
+matlab> % Split Data
+matlab> train_data = heart_scale_inst(1:150,:);
+matlab> train_label = heart_scale_label(1:150,:);
+matlab> test_data = heart_scale_inst(151:270,:);
+matlab> test_label = heart_scale_label(151:270,:);
+matlab>
+matlab> % Linear Kernel
+matlab> model_linear = svmtrain(train_label, train_data, '-t 0');
+matlab> [predict_label_L, accuracy_L, dec_values_L] = svmpredict(test_label, test_data, model_linear);
+matlab>
+matlab> % Precomputed Kernel
+matlab> model_precomputed = svmtrain(train_label, [(1:150)', train_data*train_data'], '-t 4');
+matlab> [predict_label_P, accuracy_P, dec_values_P] = svmpredict(test_label, [(1:120)', test_data*train_data'], model_precomputed);
+matlab>
+matlab> accuracy_L % Display the accuracy using linear kernel
+matlab> accuracy_P % Display the accuracy using precomputed kernel
+
+Note that for testing, you can put anything in the
+testing_label_vector. For more details of precomputed kernels, please
+read the section ``Precomputed Kernels'' in the README of the LIBSVM
+package.
+
+Additional Information
+======================
+
+This interface was initially written by Jun-Cheng Chen, Kuan-Jen Peng,
+Chih-Yuan Yang and Chih-Huai Cheng from Department of Computer
+Science, National Taiwan University. The current version was prepared
+by Rong-En Fan and Ting-Fan Wu. If you find this tool useful, please
+cite LIBSVM as follows
+
+Chih-Chung Chang and Chih-Jen Lin, LIBSVM : a library for support
+vector machines. ACM Transactions on Intelligent Systems and
+Technology, 2:27:1--27:27, 2011. Software available at
+http://www.csie.ntu.edu.tw/~cjlin/libsvm
+
+For any question, please contact Chih-Jen Lin <cjlin@csie.ntu.edu.tw>,
+or check the FAQ page:
+
+http://www.csie.ntu.edu.tw/~cjlin/libsvm/faq.html#/Q10:_MATLAB_interface
diff --git a/src/backend/app/algorithms/evaluate/libsvm/matlab/libsvmread.c b/src/backend/app/algorithms/evaluate/libsvm/matlab/libsvmread.c
new file mode 100644
index 0000000..d2fe0f5
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/matlab/libsvmread.c
@@ -0,0 +1,212 @@
+#include <stdio.h>
+#include <stdlib.h>
+#include <string.h>
+#include <ctype.h>
+#include <errno.h>
+
+#include "mex.h"
+
+#ifdef MX_API_VER
+#if MX_API_VER < 0x07030000
+typedef int mwIndex;
+#endif
+#endif
+#ifndef max
+#define max(x,y) (((x)>(y))?(x):(y))
+#endif
+#ifndef min
+#define min(x,y) (((x)<(y))?(x):(y))
+#endif
+
+void exit_with_help()
+{
+ mexPrintf(
+ "Usage: [label_vector, instance_matrix] = libsvmread('filename');\n"
+ );
+}
+
+static void fake_answer(int nlhs, mxArray *plhs[])
+{
+ int i;
+ for(i=0;i<nlhs;i++)
+ plhs[i] = mxCreateDoubleMatrix(0, 0, mxREAL);
+}
+
+static char *line;
+static int max_line_len;
+
+static char* readline(FILE *input)
+{
+ int len;
+
+ if(fgets(line,max_line_len,input) == NULL)
+ return NULL;
+
+ while(strrchr(line,'\n') == NULL)
+ {
+ max_line_len *= 2;
+ line = (char *) realloc(line, max_line_len);
+ len = (int) strlen(line);
+ if(fgets(line+len,max_line_len-len,input) == NULL)
+ break;
+ }
+ return line;
+}
+
+// read in a problem (in libsvm format)
+void read_problem(const char *filename, int nlhs, mxArray *plhs[])
+{
+ int max_index, min_index, inst_max_index;
+ size_t elements, k, i, l=0;
+ FILE *fp = fopen(filename,"r");
+ char *endptr;
+ mwIndex *ir, *jc;
+ double *labels, *samples;
+
+ if(fp == NULL)
+ {
+ mexPrintf("can't open input file %s\n",filename);
+ fake_answer(nlhs, plhs);
+ return;
+ }
+
+ max_line_len = 1024;
+ line = (char *) malloc(max_line_len*sizeof(char));
+
+ max_index = 0;
+ min_index = 1; // our index starts from 1
+ elements = 0;
+ while(readline(fp) != NULL)
+ {
+ char *idx, *val;
+ int index = 0;
+
+ inst_max_index = -1; // strtol gives 0 if wrong format, and precomputed kernel has <index> start from 0
+ strtok(line," \t"); // label
+ while (1)
+ {
+ idx = strtok(NULL,":"); // index:value
+ val = strtok(NULL," \t");
+ if(val == NULL)
+ break;
+
+ errno = 0;
+ index = (int) strtol(idx,&endptr,10);
+ if(endptr == idx || errno != 0 || *endptr != '\0' || index <= inst_max_index)
+ {
+ mexPrintf("Wrong input format at line %d\n",l+1);
+ fake_answer(nlhs, plhs);
+ return;
+ }
+ else
+ inst_max_index = index;
+
+ min_index = min(min_index, index);
+ elements++;
+ }
+ max_index = max(max_index, inst_max_index);
+ l++;
+ }
+ rewind(fp);
+
+ // y
+ plhs[0] = mxCreateDoubleMatrix(l, 1, mxREAL);
+ // x^T
+ if (min_index <= 0)
+ plhs[1] = mxCreateSparse(max_index-min_index+1, l, elements, mxREAL);
+ else
+ plhs[1] = mxCreateSparse(max_index, l, elements, mxREAL);
+
+ labels = mxGetPr(plhs[0]);
+ samples = mxGetPr(plhs[1]);
+ ir = mxGetIr(plhs[1]);
+ jc = mxGetJc(plhs[1]);
+
+ k=0;
+ for(i=0;i<l;i++)
+ {
+ char *idx, *val, *label;
+ jc[i] = k;
+
+ readline(fp);
+
+ label = strtok(line," \t\n");
+ if(label == NULL)
+ {
+ mexPrintf("Empty line at line %d\n",i+1);
+ fake_answer(nlhs, plhs);
+ return;
+ }
+ labels[i] = strtod(label,&endptr);
+ if(endptr == label || *endptr != '\0')
+ {
+ mexPrintf("Wrong input format at line %d\n",i+1);
+ fake_answer(nlhs, plhs);
+ return;
+ }
+
+ // features
+ while(1)
+ {
+ idx = strtok(NULL,":");
+ val = strtok(NULL," \t");
+ if(val == NULL)
+ break;
+
+ ir[k] = (mwIndex) (strtol(idx,&endptr,10) - min_index); // precomputed kernel has <index> start from 0
+
+ errno = 0;
+ samples[k] = strtod(val,&endptr);
+ if (endptr == val || errno != 0 || (*endptr != '\0' && !isspace(*endptr)))
+ {
+ mexPrintf("Wrong input format at line %d\n",i+1);
+ fake_answer(nlhs, plhs);
+ return;
+ }
+ ++k;
+ }
+ }
+ jc[l] = k;
+
+ fclose(fp);
+ free(line);
+
+ {
+ mxArray *rhs[1], *lhs[1];
+ rhs[0] = plhs[1];
+ if(mexCallMATLAB(1, lhs, 1, rhs, "transpose"))
+ {
+ mexPrintf("Error: cannot transpose problem\n");
+ fake_answer(nlhs, plhs);
+ return;
+ }
+ plhs[1] = lhs[0];
+ }
+}
+
+void mexFunction( int nlhs, mxArray *plhs[],
+ int nrhs, const mxArray *prhs[] )
+{
+ char filename[256];
+
+ if(nrhs != 1 || nlhs != 2)
+ {
+ exit_with_help();
+ fake_answer(nlhs, plhs);
+ return;
+ }
+
+ mxGetString(prhs[0], filename, mxGetN(prhs[0]) + 1);
+
+ if(filename == NULL)
+ {
+ mexPrintf("Error: filename is NULL\n");
+ return;
+ }
+
+ read_problem(filename, nlhs, plhs);
+
+ return;
+}
+
diff --git a/src/backend/app/algorithms/evaluate/libsvm/matlab/libsvmwrite.c b/src/backend/app/algorithms/evaluate/libsvm/matlab/libsvmwrite.c
new file mode 100644
index 0000000..9c93fd3
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/matlab/libsvmwrite.c
@@ -0,0 +1,119 @@
+#include <stdio.h>
+#include <stdlib.h>
+#include <string.h>
+#include "mex.h"
+
+#ifdef MX_API_VER
+#if MX_API_VER < 0x07030000
+typedef int mwIndex;
+#endif
+#endif
+
+void exit_with_help()
+{
+ mexPrintf(
+ "Usage: libsvmwrite('filename', label_vector, instance_matrix);\n"
+ );
+}
+
+static void fake_answer(int nlhs, mxArray *plhs[])
+{
+ int i;
+ for(i=0;i<nlhs;i++)
+ plhs[i] = mxCreateDoubleMatrix(0, 0, mxREAL);
+}
+
+void libsvmwrite(const char *filename, const mxArray *label_vec, const mxArray *instance_mat)
+{
+ FILE *fp = fopen(filename,"w");
+ mwIndex *ir, *jc, k, low, high;
+ size_t i, l, label_vector_row_num;
+ double *samples, *labels;
+ mxArray *instance_mat_col; // instance sparse matrix in column format
+
+ if(fp == NULL)
+ {
+ mexPrintf("can't open output file %s\n",filename);
+ return;
+ }
+
+ // transpose instance matrix
+ {
+ mxArray *prhs[1], *plhs[1];
+ prhs[0] = mxDuplicateArray(instance_mat);
+ if(mexCallMATLAB(1, plhs, 1, prhs, "transpose"))
+ {
+ mexPrintf("Error: cannot transpose instance matrix\n");
+ return;
+ }
+ instance_mat_col = plhs[0];
+ mxDestroyArray(prhs[0]);
+ }
+
+ // the number of instances
+ l = mxGetN(instance_mat_col);
+ label_vector_row_num = mxGetM(label_vec);
+
+ if(label_vector_row_num!=l)
+ {
+ mexPrintf("Length of label vector does not match # of instances.\n");
+ return;
+ }
+
+ // each column is one instance
+ labels = mxGetPr(label_vec);
+ samples = mxGetPr(instance_mat_col);
+ ir = mxGetIr(instance_mat_col);
+ jc = mxGetJc(instance_mat_col);
+
+ for(i=0;i<l;i++)
+ {
+ fprintf(fp,"%g", labels[i]);
+
+ low = jc[i], high = jc[i+1];
+ for(k=low;k<high;k++)
+ fprintf(fp," %lu:%g", (unsigned long)ir[k]+1, samples[k]);
+
+ fprintf(fp,"\n");
+ }
+
+ fclose(fp);
+}
+
+void mexFunction( int nlhs, mxArray *plhs[],
+ int nrhs, const mxArray *prhs[] )
+{
+ if(nlhs > 0)
+ {
+ exit_with_help();
+ fake_answer(nlhs, plhs);
+ return;
+ }
+
+ // Transform the input Matrix to libsvm format
+ if(nrhs == 3)
+ {
+ char filename[256];
+ if(!mxIsDouble(prhs[1]) || !mxIsDouble(prhs[2]))
+ {
+ mexPrintf("Error: label vector and instance matrix must be double\n");
+ return;
+ }
+
+ mxGetString(prhs[0], filename, mxGetN(prhs[0])+1);
+
+ if(mxIsSparse(prhs[2]))
+ libsvmwrite(filename, prhs[1], prhs[2]);
+ else
+ {
+ mexPrintf("Instance_matrix must be sparse\n");
+ return;
+ }
+ }
+ else
+ {
+ exit_with_help();
+ return;
+ }
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/matlab/make.m b/src/backend/app/algorithms/evaluate/libsvm/matlab/make.m
new file mode 100644
index 0000000..276bfae
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/matlab/make.m
@@ -0,0 +1,22 @@
+% This make.m is for MATLAB and OCTAVE under Windows, Mac, and Unix
+function make()
+try
+ % This part is for OCTAVE
+ if (exist ('OCTAVE_VERSION', 'builtin'))
+ mex libsvmread.c
+ mex libsvmwrite.c
+ mex -I.. svmtrain.c ../svm.cpp svm_model_matlab.c
+ mex -I.. svmpredict.c ../svm.cpp svm_model_matlab.c
+ % This part is for MATLAB
+ % Add -largeArrayDims on 64-bit machines of MATLAB
+ else
+ mex CFLAGS="\$CFLAGS -std=c99" -largeArrayDims libsvmread.c
+ mex CFLAGS="\$CFLAGS -std=c99" -largeArrayDims libsvmwrite.c
+ mex CFLAGS="\$CFLAGS -std=c99" -I.. -largeArrayDims svmtrain.c ../svm.cpp svm_model_matlab.c
+ mex CFLAGS="\$CFLAGS -std=c99" -I.. -largeArrayDims svmpredict.c ../svm.cpp svm_model_matlab.c
+ end
+catch err
+ fprintf('Error: %s failed (line %d)\n', err.stack(1).file, err.stack(1).line);
+ disp(err.message);
+ fprintf('=> Please check README for detailed instructions.\n');
+end
diff --git a/src/backend/app/algorithms/evaluate/libsvm/matlab/svm_model_matlab.c b/src/backend/app/algorithms/evaluate/libsvm/matlab/svm_model_matlab.c
new file mode 100644
index 0000000..1fea1ba
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/matlab/svm_model_matlab.c
@@ -0,0 +1,374 @@
+#include <stdlib.h>
+#include <string.h>
+#include "svm.h"
+
+#include "mex.h"
+
+#ifdef MX_API_VER
+#if MX_API_VER < 0x07030000
+typedef int mwIndex;
+#endif
+#endif
+
+#define NUM_OF_RETURN_FIELD 11
+
+#define Malloc(type,n) (type *)malloc((n)*sizeof(type))
+
+static const char *field_names[] = {
+ "Parameters",
+ "nr_class",
+ "totalSV",
+ "rho",
+ "Label",
+ "sv_indices",
+ "ProbA",
+ "ProbB",
+ "nSV",
+ "sv_coef",
+ "SVs"
+};
+
+const char *model_to_matlab_structure(mxArray *plhs[], int num_of_feature, struct svm_model *model)
+{
+ int i, j, n;
+ double *ptr;
+ mxArray *return_model, **rhs;
+ int out_id = 0;
+
+ rhs = (mxArray **)mxMalloc(sizeof(mxArray *)*NUM_OF_RETURN_FIELD);
+
+ // Parameters
+ rhs[out_id] = mxCreateDoubleMatrix(5, 1, mxREAL);
+ ptr = mxGetPr(rhs[out_id]);
+ ptr[0] = model->param.svm_type;
+ ptr[1] = model->param.kernel_type;
+ ptr[2] = model->param.degree;
+ ptr[3] = model->param.gamma;
+ ptr[4] = model->param.coef0;
+ out_id++;
+
+ // nr_class
+ rhs[out_id] = mxCreateDoubleMatrix(1, 1, mxREAL);
+ ptr = mxGetPr(rhs[out_id]);
+ ptr[0] = model->nr_class;
+ out_id++;
+
+ // total SV
+ rhs[out_id] = mxCreateDoubleMatrix(1, 1, mxREAL);
+ ptr = mxGetPr(rhs[out_id]);
+ ptr[0] = model->l;
+ out_id++;
+
+ // rho
+ n = model->nr_class*(model->nr_class-1)/2;
+ rhs[out_id] = mxCreateDoubleMatrix(n, 1, mxREAL);
+ ptr = mxGetPr(rhs[out_id]);
+ for(i = 0; i < n; i++)
+ ptr[i] = model->rho[i];
+ out_id++;
+
+ // Label
+ if(model->label)
+ {
+ rhs[out_id] = mxCreateDoubleMatrix(model->nr_class, 1, mxREAL);
+ ptr = mxGetPr(rhs[out_id]);
+ for(i = 0; i < model->nr_class; i++)
+ ptr[i] = model->label[i];
+ }
+ else
+ rhs[out_id] = mxCreateDoubleMatrix(0, 0, mxREAL);
+ out_id++;
+
+ // sv_indices
+ if(model->sv_indices)
+ {
+ rhs[out_id] = mxCreateDoubleMatrix(model->l, 1, mxREAL);
+ ptr = mxGetPr(rhs[out_id]);
+ for(i = 0; i < model->l; i++)
+ ptr[i] = model->sv_indices[i];
+ }
+ else
+ rhs[out_id] = mxCreateDoubleMatrix(0, 0, mxREAL);
+ out_id++;
+
+ // probA
+ if(model->probA != NULL)
+ {
+ rhs[out_id] = mxCreateDoubleMatrix(n, 1, mxREAL);
+ ptr = mxGetPr(rhs[out_id]);
+ for(i = 0; i < n; i++)
+ ptr[i] = model->probA[i];
+ }
+ else
+ rhs[out_id] = mxCreateDoubleMatrix(0, 0, mxREAL);
+ out_id ++;
+
+ // probB
+ if(model->probB != NULL)
+ {
+ rhs[out_id] = mxCreateDoubleMatrix(n, 1, mxREAL);
+ ptr = mxGetPr(rhs[out_id]);
+ for(i = 0; i < n; i++)
+ ptr[i] = model->probB[i];
+ }
+ else
+ rhs[out_id] = mxCreateDoubleMatrix(0, 0, mxREAL);
+ out_id++;
+
+ // nSV
+ if(model->nSV)
+ {
+ rhs[out_id] = mxCreateDoubleMatrix(model->nr_class, 1, mxREAL);
+ ptr = mxGetPr(rhs[out_id]);
+ for(i = 0; i < model->nr_class; i++)
+ ptr[i] = model->nSV[i];
+ }
+ else
+ rhs[out_id] = mxCreateDoubleMatrix(0, 0, mxREAL);
+ out_id++;
+
+ // sv_coef
+ rhs[out_id] = mxCreateDoubleMatrix(model->l, model->nr_class-1, mxREAL);
+ ptr = mxGetPr(rhs[out_id]);
+ for(i = 0; i < model->nr_class-1; i++)
+ for(j = 0; j < model->l; j++)
+ ptr[(i*(model->l))+j] = model->sv_coef[i][j];
+ out_id++;
+
+ // SVs
+ {
+ int ir_index, nonzero_element;
+ mwIndex *ir, *jc;
+ mxArray *pprhs[1], *pplhs[1];
+
+ if(model->param.kernel_type == PRECOMPUTED)
+ {
+ nonzero_element = model->l;
+ num_of_feature = 1;
+ }
+ else
+ {
+ nonzero_element = 0;
+ for(i = 0; i < model->l; i++) {
+ j = 0;
+ while(model->SV[i][j].index != -1)
+ {
+ nonzero_element++;
+ j++;
+ }
+ }
+ }
+
+ // SV in column, easier accessing
+ rhs[out_id] = mxCreateSparse(num_of_feature, model->l, nonzero_element, mxREAL);
+ ir = mxGetIr(rhs[out_id]);
+ jc = mxGetJc(rhs[out_id]);
+ ptr = mxGetPr(rhs[out_id]);
+ jc[0] = ir_index = 0;
+ for(i = 0;i < model->l; i++)
+ {
+ if(model->param.kernel_type == PRECOMPUTED)
+ {
+ // make a (1 x model->l) matrix
+ ir[ir_index] = 0;
+ ptr[ir_index] = model->SV[i][0].value;
+ ir_index++;
+ jc[i+1] = jc[i] + 1;
+ }
+ else
+ {
+ int x_index = 0;
+ while (model->SV[i][x_index].index != -1)
+ {
+ ir[ir_index] = model->SV[i][x_index].index - 1;
+ ptr[ir_index] = model->SV[i][x_index].value;
+ ir_index++, x_index++;
+ }
+ jc[i+1] = jc[i] + x_index;
+ }
+ }
+ // transpose back to SV in row
+ pprhs[0] = rhs[out_id];
+ if(mexCallMATLAB(1, pplhs, 1, pprhs, "transpose"))
+ return "cannot transpose SV matrix";
+ rhs[out_id] = pplhs[0];
+ out_id++;
+ }
+
+ /* Create a struct matrix contains NUM_OF_RETURN_FIELD fields */
+ return_model = mxCreateStructMatrix(1, 1, NUM_OF_RETURN_FIELD, field_names);
+
+ /* Fill struct matrix with input arguments */
+ for(i = 0; i < NUM_OF_RETURN_FIELD; i++)
+ mxSetField(return_model,0,field_names[i],mxDuplicateArray(rhs[i]));
+ /* return */
+ plhs[0] = return_model;
+ mxFree(rhs);
+
+ return NULL;
+}
+
+struct svm_model *matlab_matrix_to_model(const mxArray *matlab_struct, const char **msg)
+{
+ int i, j, n, num_of_fields;
+ double *ptr;
+ int id = 0;
+ struct svm_node *x_space;
+ struct svm_model *model;
+ mxArray **rhs;
+
+ num_of_fields = mxGetNumberOfFields(matlab_struct);
+ if(num_of_fields != NUM_OF_RETURN_FIELD)
+ {
+ *msg = "number of return field is not correct";
+ return NULL;
+ }
+ rhs = (mxArray **) mxMalloc(sizeof(mxArray *)*num_of_fields);
+
+ for(i=0;i<num_of_fields;i++)
+ rhs[i] = mxGetFieldByNumber(matlab_struct, 0, i);
+
+ model = Malloc(struct svm_model, 1);
+ model->rho = NULL;
+ model->probA = NULL;
+ model->probB = NULL;
+ model->label = NULL;
+ model->sv_indices = NULL;
+ model->nSV = NULL;
+ model->free_sv = 1; // XXX
+
+ ptr = mxGetPr(rhs[id]);
+ model->param.svm_type = (int)ptr[0];
+ model->param.kernel_type = (int)ptr[1];
+ model->param.degree = (int)ptr[2];
+ model->param.gamma = ptr[3];
+ model->param.coef0 = ptr[4];
+ id++;
+
+ ptr = mxGetPr(rhs[id]);
+ model->nr_class = (int)ptr[0];
+ id++;
+
+ ptr = mxGetPr(rhs[id]);
+ model->l = (int)ptr[0];
+ id++;
+
+ // rho
+ n = model->nr_class * (model->nr_class-1)/2;
+ model->rho = (double*) malloc(n*sizeof(double));
+ ptr = mxGetPr(rhs[id]);
+ for(i=0;i<n;i++)
+ model->rho[i] = ptr[i];
+ id++;
+
+ // label
+ if(mxIsEmpty(rhs[id]) == 0)
+ {
+ model->label = (int*) malloc(model->nr_class*sizeof(int));
+ ptr = mxGetPr(rhs[id]);
+ for(i=0;i<model->nr_class;i++)
+ model->label[i] = (int)ptr[i];
+ }
+ id++;
+
+ // sv_indices
+ if(mxIsEmpty(rhs[id]) == 0)
+ {
+ model->sv_indices = (int*) malloc(model->l*sizeof(int));
+ ptr = mxGetPr(rhs[id]);
+ for(i=0;i<model->l;i++)
+ model->sv_indices[i] = (int)ptr[i];
+ }
+ id++;
+
+ // probA
+ if(mxIsEmpty(rhs[id]) == 0)
+ {
+ model->probA = (double*) malloc(n*sizeof(double));
+ ptr = mxGetPr(rhs[id]);
+ for(i=0;i<n;i++)
+ model->probA[i] = ptr[i];
+ }
+ id++;
+
+ // probB
+ if(mxIsEmpty(rhs[id]) == 0)
+ {
+ model->probB = (double*) malloc(n*sizeof(double));
+ ptr = mxGetPr(rhs[id]);
+ for(i=0;i<n;i++)
+ model->probB[i] = ptr[i];
+ }
+ id++;
+
+ // nSV
+ if(mxIsEmpty(rhs[id]) == 0)
+ {
+ model->nSV = (int*) malloc(model->nr_class*sizeof(int));
+ ptr = mxGetPr(rhs[id]);
+ for(i=0;i<model->nr_class;i++)
+ model->nSV[i] = (int)ptr[i];
+ }
+ id++;
+
+ // sv_coef
+ ptr = mxGetPr(rhs[id]);
+ model->sv_coef = (double**) malloc((model->nr_class-1)*sizeof(double));
+ for( i=0 ; i< model->nr_class -1 ; i++ )
+ model->sv_coef[i] = (double*) malloc((model->l)*sizeof(double));
+ for(i = 0; i < model->nr_class - 1; i++)
+ for(j = 0; j < model->l; j++)
+ model->sv_coef[i][j] = ptr[i*(model->l)+j];
+ id++;
+
+ // SV
+ {
+ int sr, elements;
+ int num_samples;
+ mwIndex *ir, *jc;
+ mxArray *pprhs[1], *pplhs[1];
+
+ // transpose SV
+ pprhs[0] = rhs[id];
+ if(mexCallMATLAB(1, pplhs, 1, pprhs, "transpose"))
+ {
+ svm_free_and_destroy_model(&model);
+ *msg = "cannot transpose SV matrix";
+ return NULL;
+ }
+ rhs[id] = pplhs[0];
+
+ sr = (int)mxGetN(rhs[id]);
+
+ ptr = mxGetPr(rhs[id]);
+ ir = mxGetIr(rhs[id]);
+ jc = mxGetJc(rhs[id]);
+
+ num_samples = (int)mxGetNzmax(rhs[id]);
+
+ elements = num_samples + sr;
+
+ model->SV = (struct svm_node **) malloc(sr * sizeof(struct svm_node *));
+ x_space = (struct svm_node *)malloc(elements * sizeof(struct svm_node));
+
+ // SV is in column
+ for(i=0;i<sr;i++)
+ {
+ int low = (int)jc[i], high = (int)jc[i+1];
+ int x_index = 0;
+ model->SV[i] = &x_space[low+i];
+ for(j=low;j<high;j++)
+ {
+ model->SV[i][x_index].index = (int)ir[j] + 1;
+ model->SV[i][x_index].value = ptr[j];
+ x_index++;
+ }
+ model->SV[i][x_index].index = -1;
+ }
+
+ id++;
+ }
+ mxFree(rhs);
+
+ return model;
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/matlab/svm_model_matlab.h b/src/backend/app/algorithms/evaluate/libsvm/matlab/svm_model_matlab.h
new file mode 100644
index 0000000..3668a84
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/matlab/svm_model_matlab.h
@@ -0,0 +1,2 @@
+const char *model_to_matlab_structure(mxArray *plhs[], int num_of_feature, struct svm_model *model);
+struct svm_model *matlab_matrix_to_model(const mxArray *matlab_struct, const char **error_message);
diff --git a/src/backend/app/algorithms/evaluate/libsvm/matlab/svmpredict.c b/src/backend/app/algorithms/evaluate/libsvm/matlab/svmpredict.c
new file mode 100644
index 0000000..96fedbc
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/matlab/svmpredict.c
@@ -0,0 +1,370 @@
+#include <stdio.h>
+#include <stdlib.h>
+#include <string.h>
+#include "svm.h"
+
+#include "mex.h"
+#include "svm_model_matlab.h"
+
+#ifdef MX_API_VER
+#if MX_API_VER < 0x07030000
+typedef int mwIndex;
+#endif
+#endif
+
+#define CMD_LEN 2048
+
+int print_null(const char *s,...) {return 0;}
+int (*info)(const char *fmt,...) = &mexPrintf;
+
+void read_sparse_instance(const mxArray *prhs, int index, struct svm_node *x)
+{
+ int i, j, low, high;
+ mwIndex *ir, *jc;
+ double *samples;
+
+ ir = mxGetIr(prhs);
+ jc = mxGetJc(prhs);
+ samples = mxGetPr(prhs);
+
+ // each column is one instance
+ j = 0;
+ low = (int)jc[index], high = (int)jc[index+1];
+ for(i=low;i<high;i++)
+ {
+ x[j].index = (int)ir[i] + 1;
+ x[j].value = samples[i];
+ j++;
+ }
+ x[j].index = -1;
+}
+
+static void fake_answer(int nlhs, mxArray *plhs[])
+{
+ int i;
+ for(i=0;i<nlhs;i++)
+ plhs[i] = mxCreateDoubleMatrix(0, 0, mxREAL);
+}
+
+void predict(int nlhs, mxArray *plhs[], const mxArray *prhs[], struct svm_model *model, const int predict_probability)
+{
+ int label_vector_row_num, label_vector_col_num;
+ int feature_number, testing_instance_number;
+ int instance_index;
+ double *ptr_instance, *ptr_label, *ptr_predict_label;
+ double *ptr_prob_estimates, *ptr_dec_values, *ptr;
+ struct svm_node *x;
+ mxArray *pplhs[1]; // transposed instance sparse matrix
+ mxArray *tplhs[3]; // temporary storage for plhs[]
+
+ int correct = 0;
+ int total = 0;
+ double error = 0;
+ double sump = 0, sumt = 0, sumpp = 0, sumtt = 0, sumpt = 0;
+
+ int svm_type=svm_get_svm_type(model);
+ int nr_class=svm_get_nr_class(model);
+ double *prob_estimates=NULL;
+
+ // prhs[1] = testing instance matrix
+ feature_number = (int)mxGetN(prhs[1]);
+ testing_instance_number = (int)mxGetM(prhs[1]);
+ label_vector_row_num = (int)mxGetM(prhs[0]);
+ label_vector_col_num = (int)mxGetN(prhs[0]);
+
+ if(label_vector_row_num!=testing_instance_number)
+ {
+ mexPrintf("Length of label vector does not match # of instances.\n");
+ fake_answer(nlhs, plhs);
+ return;
+ }
+ if(label_vector_col_num!=1)
+ {
+ mexPrintf("label (1st argument) should be a vector (# of column is 1).\n");
+ fake_answer(nlhs, plhs);
+ return;
+ }
+
+ ptr_instance = mxGetPr(prhs[1]);
+ ptr_label = mxGetPr(prhs[0]);
+
+ // transpose instance matrix
+ if(mxIsSparse(prhs[1]))
+ {
+ if(model->param.kernel_type == PRECOMPUTED)
+ {
+ // precomputed kernel requires dense matrix, so we make one
+ mxArray *rhs[1], *lhs[1];
+ rhs[0] = mxDuplicateArray(prhs[1]);
+ if(mexCallMATLAB(1, lhs, 1, rhs, "full"))
+ {
+ mexPrintf("Error: cannot full testing instance matrix\n");
+ fake_answer(nlhs, plhs);
+ return;
+ }
+ ptr_instance = mxGetPr(lhs[0]);
+ mxDestroyArray(rhs[0]);
+ }
+ else
+ {
+ mxArray *pprhs[1];
+ pprhs[0] = mxDuplicateArray(prhs[1]);
+ if(mexCallMATLAB(1, pplhs, 1, pprhs, "transpose"))
+ {
+ mexPrintf("Error: cannot transpose testing instance matrix\n");
+ fake_answer(nlhs, plhs);
+ return;
+ }
+ }
+ }
+
+ if(predict_probability)
+ {
+ if(svm_type==NU_SVR || svm_type==EPSILON_SVR)
+ info("Prob. model for test data: target value = predicted value + z,\nz: Laplace distribution e^(-|z|/sigma)/(2sigma),sigma=%g\n",svm_get_svr_probability(model));
+ else
+ prob_estimates = (double *) malloc(nr_class*sizeof(double));
+ }
+
+ tplhs[0] = mxCreateDoubleMatrix(testing_instance_number, 1, mxREAL);
+ if(predict_probability)
+ {
+ // prob estimates are in plhs[2]
+ if(svm_type==C_SVC || svm_type==NU_SVC)
+ tplhs[2] = mxCreateDoubleMatrix(testing_instance_number, nr_class, mxREAL);
+ else
+ tplhs[2] = mxCreateDoubleMatrix(0, 0, mxREAL);
+ }
+ else
+ {
+ // decision values are in plhs[2]
+ if(svm_type == ONE_CLASS ||
+ svm_type == EPSILON_SVR ||
+ svm_type == NU_SVR ||
+ nr_class == 1) // if only one class in training data, decision values are still returned.
+ tplhs[2] = mxCreateDoubleMatrix(testing_instance_number, 1, mxREAL);
+ else
+ tplhs[2] = mxCreateDoubleMatrix(testing_instance_number, nr_class*(nr_class-1)/2, mxREAL);
+ }
+
+ ptr_predict_label = mxGetPr(tplhs[0]);
+ ptr_prob_estimates = mxGetPr(tplhs[2]);
+ ptr_dec_values = mxGetPr(tplhs[2]);
+ x = (struct svm_node*)malloc((feature_number+1)*sizeof(struct svm_node) );
+ for(instance_index=0;instance_index<testing_instance_number;instance_index++)
+ {
+ int i;
+ double target_label, predict_label;
+
+ target_label = ptr_label[instance_index];
+
+ if(mxIsSparse(prhs[1]) && model->param.kernel_type != PRECOMPUTED) // prhs[1]^T is still sparse
+ read_sparse_instance(pplhs[0], instance_index, x);
+ else
+ {
+ for(i=0;i<feature_number;i++)
+ {
+ x[i].index = i+1;
+ x[i].value = ptr_instance[testing_instance_number*i+instance_index];
+ }
+ x[feature_number].index = -1;
+ }
+
+ if(predict_probability)
+ {
+ if(svm_type==C_SVC || svm_type==NU_SVC)
+ {
+ predict_label = svm_predict_probability(model, x, prob_estimates);
+ ptr_predict_label[instance_index] = predict_label;
+ for(i=0;i<nr_class;i++)
+ ptr_prob_estimates[instance_index + i * testing_instance_number] = prob_estimates[i];
+ }
+ else
+ {
+ predict_label = svm_predict(model,x);
+ ptr_predict_label[instance_index] = predict_label;
+ }
+ }
+ else
+ {
+ if(svm_type == ONE_CLASS ||
+ svm_type == EPSILON_SVR ||
+ svm_type == NU_SVR)
+ {
+ double res;
+ predict_label = svm_predict_values(model, x, &res);
+ ptr_dec_values[instance_index] = res;
+ }
+ else
+ {
+ double *dec_values = (double *) malloc(sizeof(double) * nr_class*(nr_class-1)/2);
+ predict_label = svm_predict_values(model, x, dec_values);
+ if(nr_class == 1)
+ ptr_dec_values[instance_index] = 1;
+ else
+ for(i=0;i<(nr_class*(nr_class-1))/2;i++)
+ ptr_dec_values[instance_index + i * testing_instance_number] = dec_values[i];
+ free(dec_values);
+ }
+ ptr_predict_label[instance_index] = predict_label;
+ }
+
+ if(predict_label == target_label)
+ ++correct;
+ error += (predict_label-target_label)*(predict_label-target_label);
+ sump += predict_label;
+ sumt += target_label;
+ sumpp += predict_label*predict_label;
+ sumtt += target_label*target_label;
+ sumpt += predict_label*target_label;
+ ++total;
+ }
+
+ if(svm_type==NU_SVR || svm_type==EPSILON_SVR)
+ {
+ info("Mean squared error = %g (regression)\n",error/total);
+ info("Squared correlation coefficient = %g (regression)\n",
+ ((total*sumpt-sump*sumt)*(total*sumpt-sump*sumt))/
+ ((total*sumpp-sump*sump)*(total*sumtt-sumt*sumt))
+ );
+ }
+ else
+ info("Accuracy = %g%% (%d/%d) (classification)\n",
+ (double)correct/total*100,correct,total);
+
+ // return accuracy, mse, scc
+ tplhs[1] = mxCreateDoubleMatrix(3, 1, mxREAL);
+ ptr = mxGetPr(tplhs[1]);
+ ptr[0] = (double)correct/total*100;
+ ptr[1] = error/total;
+ ptr[2] = ((total*sumpt-sump*sumt)*(total*sumpt-sump*sumt))/
+ ((total*sumpp-sump*sump)*(total*sumtt-sumt*sumt));
+
+ free(x);
+ if(prob_estimates != NULL)
+ free(prob_estimates);
+
+ switch(nlhs)
+ {
+ case 3:
+ plhs[2] = tplhs[2];
+ plhs[1] = tplhs[1];
+ case 1:
+ case 0:
+ plhs[0] = tplhs[0];
+ }
+}
+
+void exit_with_help()
+{
+ mexPrintf(
+ "Usage: [predicted_label, accuracy, decision_values/prob_estimates] = svmpredict(testing_label_vector, testing_instance_matrix, model, 'libsvm_options')\n"
+ "Parameters:\n"
+ " model: SVM model structure from svmtrain.\n"
+ " libsvm_options:\n"
+ " -b probability_estimates: whether to predict probability estimates, 0 or 1 (default 0); one-class SVM not supported yet\n"
+ " -q : quiet mode (no outputs)\n"
+ "Returns:\n"
+ " predicted_label: SVM prediction output vector.\n"
+ " accuracy: a vector with accuracy, mean squared error, squared correlation coefficient.\n"
+ " prob_estimates: If selected, probability estimate vector.\n"
+ );
+}
+
+void mexFunction( int nlhs, mxArray *plhs[],
+ int nrhs, const mxArray *prhs[] )
+{
+ int prob_estimate_flag = 0;
+ struct svm_model *model;
+
+ if(nlhs > 3 || nrhs > 4 || nrhs < 3)
+ {
+ exit_with_help();
+ fake_answer(nlhs, plhs);
+ return;
+ }
+
+ if(!mxIsDouble(prhs[0]) || !mxIsDouble(prhs[1])) {
+ mexPrintf("Error: label vector and instance matrix must be double\n");
+ fake_answer(nlhs, plhs);
+ return;
+ }
+
+ if(mxIsStruct(prhs[2]))
+ {
+ const char *error_msg;
+
+ // parse options
+ if(nrhs==4)
+ {
+ int i, argc = 1;
+ char cmd[CMD_LEN], *argv[CMD_LEN/2];
+
+ // put options in argv[]
+ mxGetString(prhs[3], cmd, mxGetN(prhs[3]) + 1);
+ if((argv[argc] = strtok(cmd, " ")) != NULL)
+ while((argv[++argc] = strtok(NULL, " ")) != NULL)
+ ;
+
+ for(i=1;i<argc;i++)
+ {
+ if(argv[i][0] != '-') break;
+ ++i;
+ if((i>=argc) && argv[i-1][1] != 'q')
+ {
+ exit_with_help();
+ fake_answer(nlhs, plhs);
+ return;
+ }
+ switch(argv[i-1][1])
+ {
+ case 'b':
+ prob_estimate_flag = atoi(argv[i]);
+ break;
+ case 'q':
+ i--;
+ info = &print_null;
+ break;
+ default:
+ mexPrintf("Unknown option: -%c\n", argv[i-1][1]);
+ exit_with_help();
+ fake_answer(nlhs, plhs);
+ return;
+ }
+ }
+ }
+
+ model = matlab_matrix_to_model(prhs[2], &error_msg);
+ if (model == NULL)
+ {
+ mexPrintf("Error: can't read model: %s\n", error_msg);
+ fake_answer(nlhs, plhs);
+ return;
+ }
+
+ if(prob_estimate_flag)
+ {
+ if(svm_check_probability_model(model)==0)
+ {
+ mexPrintf("Model does not support probabiliy estimates\n");
+ fake_answer(nlhs, plhs);
+ svm_free_and_destroy_model(&model);
+ return;
+ }
+ }
+ else
+ {
+ if(svm_check_probability_model(model)!=0)
+ info("Model supports probability estimates, but disabled in predicton.\n");
+ }
+
+ predict(nlhs, plhs, prhs, model, prob_estimate_flag);
+ // destroy model
+ svm_free_and_destroy_model(&model);
+ }
+ else
+ {
+ mexPrintf("model file should be a struct array\n");
+ fake_answer(nlhs, plhs);
+ }
+
+ return;
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/matlab/svmtrain.c b/src/backend/app/algorithms/evaluate/libsvm/matlab/svmtrain.c
new file mode 100644
index 0000000..27a52b8
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/matlab/svmtrain.c
@@ -0,0 +1,495 @@
+#include <stdio.h>
+#include <stdlib.h>
+#include <string.h>
+#include <ctype.h>
+#include "svm.h"
+
+#include "mex.h"
+#include "svm_model_matlab.h"
+
+#ifdef MX_API_VER
+#if MX_API_VER < 0x07030000
+typedef int mwIndex;
+#endif
+#endif
+
+#define CMD_LEN 2048
+#define Malloc(type,n) (type *)malloc((n)*sizeof(type))
+
+void print_null(const char *s) {}
+void print_string_matlab(const char *s) {mexPrintf(s);}
+
+void exit_with_help()
+{
+ mexPrintf(
+ "Usage: model = svmtrain(training_label_vector, training_instance_matrix, 'libsvm_options');\n"
+ "libsvm_options:\n"
+ "-s svm_type : set type of SVM (default 0)\n"
+ " 0 -- C-SVC (multi-class classification)\n"
+ " 1 -- nu-SVC (multi-class classification)\n"
+ " 2 -- one-class SVM\n"
+ " 3 -- epsilon-SVR (regression)\n"
+ " 4 -- nu-SVR (regression)\n"
+ "-t kernel_type : set type of kernel function (default 2)\n"
+ " 0 -- linear: u'*v\n"
+ " 1 -- polynomial: (gamma*u'*v + coef0)^degree\n"
+ " 2 -- radial basis function: exp(-gamma*|u-v|^2)\n"
+ " 3 -- sigmoid: tanh(gamma*u'*v + coef0)\n"
+ " 4 -- precomputed kernel (kernel values in training_instance_matrix)\n"
+ "-d degree : set degree in kernel function (default 3)\n"
+ "-g gamma : set gamma in kernel function (default 1/num_features)\n"
+ "-r coef0 : set coef0 in kernel function (default 0)\n"
+ "-c cost : set the parameter C of C-SVC, epsilon-SVR, and nu-SVR (default 1)\n"
+ "-n nu : set the parameter nu of nu-SVC, one-class SVM, and nu-SVR (default 0.5)\n"
+ "-p epsilon : set the epsilon in loss function of epsilon-SVR (default 0.1)\n"
+ "-m cachesize : set cache memory size in MB (default 100)\n"
+ "-e epsilon : set tolerance of termination criterion (default 0.001)\n"
+ "-h shrinking : whether to use the shrinking heuristics, 0 or 1 (default 1)\n"
+ "-b probability_estimates : whether to train a SVC or SVR model for probability estimates, 0 or 1 (default 0)\n"
+ "-wi weight : set the parameter C of class i to weight*C, for C-SVC (default 1)\n"
+ "-v n : n-fold cross validation mode\n"
+ "-q : quiet mode (no outputs)\n"
+ );
+}
+
+// svm arguments
+struct svm_parameter param; // set by parse_command_line
+struct svm_problem prob; // set by read_problem
+struct svm_model *model;
+struct svm_node *x_space;
+int cross_validation;
+int nr_fold;
+
+
+double do_cross_validation()
+{
+ int i;
+ int total_correct = 0;
+ double total_error = 0;
+ double sumv = 0, sumy = 0, sumvv = 0, sumyy = 0, sumvy = 0;
+ double *target = Malloc(double,prob.l);
+ double retval = 0.0;
+
+ svm_cross_validation(&prob,¶m,nr_fold,target);
+ if(param.svm_type == EPSILON_SVR ||
+ param.svm_type == NU_SVR)
+ {
+ for(i=0;i<prob.l;i++)
+ {
+ double y = prob.y[i];
+ double v = target[i];
+ total_error += (v-y)*(v-y);
+ sumv += v;
+ sumy += y;
+ sumvv += v*v;
+ sumyy += y*y;
+ sumvy += v*y;
+ }
+ mexPrintf("Cross Validation Mean squared error = %g\n",total_error/prob.l);
+ mexPrintf("Cross Validation Squared correlation coefficient = %g\n",
+ ((prob.l*sumvy-sumv*sumy)*(prob.l*sumvy-sumv*sumy))/
+ ((prob.l*sumvv-sumv*sumv)*(prob.l*sumyy-sumy*sumy))
+ );
+ retval = total_error/prob.l;
+ }
+ else
+ {
+ for(i=0;i<prob.l;i++)
+ if(target[i] == prob.y[i])
+ ++total_correct;
+ mexPrintf("Cross Validation Accuracy = %g%%\n",100.0*total_correct/prob.l);
+ retval = 100.0*total_correct/prob.l;
+ }
+ free(target);
+ return retval;
+}
+
+// nrhs should be 3
+int parse_command_line(int nrhs, const mxArray *prhs[], char *model_file_name)
+{
+ int i, argc = 1;
+ char cmd[CMD_LEN];
+ char *argv[CMD_LEN/2];
+ void (*print_func)(const char *) = print_string_matlab; // default printing to matlab display
+
+ // default values
+ param.svm_type = C_SVC;
+ param.kernel_type = RBF;
+ param.degree = 3;
+ param.gamma = 0; // 1/num_features
+ param.coef0 = 0;
+ param.nu = 0.5;
+ param.cache_size = 100;
+ param.C = 1;
+ param.eps = 1e-3;
+ param.p = 0.1;
+ param.shrinking = 1;
+ param.probability = 0;
+ param.nr_weight = 0;
+ param.weight_label = NULL;
+ param.weight = NULL;
+ cross_validation = 0;
+
+ if(nrhs <= 1)
+ return 1;
+
+ if(nrhs > 2)
+ {
+ // put options in argv[]
+ mxGetString(prhs[2], cmd, mxGetN(prhs[2]) + 1);
+ if((argv[argc] = strtok(cmd, " ")) != NULL)
+ while((argv[++argc] = strtok(NULL, " ")) != NULL)
+ ;
+ }
+
+ // parse options
+ for(i=1;i=argc && argv[i-1][1] != 'q') // since option -q has no parameter
+ return 1;
+ switch(argv[i-1][1])
+ {
+ case 's':
+ param.svm_type = atoi(argv[i]);
+ break;
+ case 't':
+ param.kernel_type = atoi(argv[i]);
+ break;
+ case 'd':
+ param.degree = atoi(argv[i]);
+ break;
+ case 'g':
+ param.gamma = atof(argv[i]);
+ break;
+ case 'r':
+ param.coef0 = atof(argv[i]);
+ break;
+ case 'n':
+ param.nu = atof(argv[i]);
+ break;
+ case 'm':
+ param.cache_size = atof(argv[i]);
+ break;
+ case 'c':
+ param.C = atof(argv[i]);
+ break;
+ case 'e':
+ param.eps = atof(argv[i]);
+ break;
+ case 'p':
+ param.p = atof(argv[i]);
+ break;
+ case 'h':
+ param.shrinking = atoi(argv[i]);
+ break;
+ case 'b':
+ param.probability = atoi(argv[i]);
+ break;
+ case 'q':
+ print_func = &print_null;
+ i--;
+ break;
+ case 'v':
+ cross_validation = 1;
+ nr_fold = atoi(argv[i]);
+ if(nr_fold < 2)
+ {
+ mexPrintf("n-fold cross validation: n must >= 2\n");
+ return 1;
+ }
+ break;
+ case 'w':
+ ++param.nr_weight;
+ param.weight_label = (int *)realloc(param.weight_label,sizeof(int)*param.nr_weight);
+ param.weight = (double *)realloc(param.weight,sizeof(double)*param.nr_weight);
+ param.weight_label[param.nr_weight-1] = atoi(&argv[i-1][2]);
+ param.weight[param.nr_weight-1] = atof(argv[i]);
+ break;
+ default:
+ mexPrintf("Unknown option -%c\n", argv[i-1][1]);
+ return 1;
+ }
+ }
+
+ svm_set_print_string_function(print_func);
+
+ return 0;
+}
+
+// read in a problem (in svmlight format)
+int read_problem_dense(const mxArray *label_vec, const mxArray *instance_mat)
+{
+ // using size_t due to the output type of matlab functions
+ size_t i, j, k, l;
+ size_t elements, max_index, sc, label_vector_row_num;
+ double *samples, *labels;
+
+ prob.x = NULL;
+ prob.y = NULL;
+ x_space = NULL;
+
+ labels = mxGetPr(label_vec);
+ samples = mxGetPr(instance_mat);
+ sc = mxGetN(instance_mat);
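+ // MATLAB stores matrices column-major: entry (i,k) of the l-by-sc
+ // instance matrix is samples[k * l + i]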
+
+ elements = 0;
+ // number of instances
+ l = mxGetM(instance_mat);
+ label_vector_row_num = mxGetM(label_vec);
+ prob.l = (int)l;
+
+ if(label_vector_row_num!=l)
+ {
+ mexPrintf("Length of label vector does not match # of instances.\n");
+ return -1;
+ }
+
+ if(param.kernel_type == PRECOMPUTED)
+ elements = l * (sc + 1);
+ else
+ {
+ for(i = 0; i < l; i++)
+ {
+ for(k = 0; k < sc; k++)
+ if(samples[k * l + i] != 0)
+ elements++;
+ // count the '-1' element
+ elements++;
+ }
+ }
+
+ prob.y = Malloc(double,l);
+ prob.x = Malloc(struct svm_node *,l);
+ x_space = Malloc(struct svm_node, elements);
+
+ max_index = sc;
+ j = 0;
+ for(i = 0; i < l; i++)
+ {
+ prob.x[i] = &x_space[j];
+ prob.y[i] = labels[i];
+
+ for(k = 0; k < sc; k++)
+ {
+ if(param.kernel_type == PRECOMPUTED || samples[k * l + i] != 0)
+ {
+ x_space[j].index = (int)k + 1;
+ x_space[j].value = samples[k * l + i];
+ j++;
+ }
+ }
+ x_space[j++].index = -1;
+ }
+
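+ // the '-g' default is 1/num_features, so resolve it here once the
+ // number of features (max_index) is known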
+ if(param.gamma == 0 && max_index > 0)
+ param.gamma = (double)(1.0/max_index);
+
+ if(param.kernel_type == PRECOMPUTED)
+ for(i=0;i<l;i++)
+ {
+ if((int)prob.x[i][0].value <= 0 ||
+ (int)prob.x[i][0].value > (int)max_index)
+ {
+ mexPrintf("Wrong input format: sample_serial_number out of range\n");
+ return -1;
+ }
+ }
+
+ return 0;
+}
+
+int read_problem_sparse(const mxArray *label_vec, const mxArray *instance_mat)
+{
+ mwIndex *ir, *jc, low, high, k;
+ // using size_t due to the output type of matlab functions
+ size_t i, j, l, elements, max_index, label_vector_row_num;
+ mwSize num_samples;
+ double *samples, *labels;
+ mxArray *instance_mat_col; // transposed instance sparse matrix
+
+ prob.x = NULL;
+ prob.y = NULL;
+ x_space = NULL;
+
+ // transpose instance matrix
+ {
+ mxArray *prhs[1], *plhs[1];
+ prhs[0] = mxDuplicateArray(instance_mat);
+ if(mexCallMATLAB(1, plhs, 1, prhs, "transpose"))
+ {
+ mexPrintf("Error: cannot transpose training instance matrix\n");
+ return -1;
+ }
+ instance_mat_col = plhs[0];
+ mxDestroyArray(prhs[0]);
+ }
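+ // after the transpose, each instance is one column of the CSC matrix:
+ // its nonzero values are contiguous in samples, with row indices in ir
+ // and per-column start offsets in jc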
+
+ // each column is one instance
+ labels = mxGetPr(label_vec);
+ samples = mxGetPr(instance_mat_col);
+ ir = mxGetIr(instance_mat_col);
+ jc = mxGetJc(instance_mat_col);
+
+ num_samples = mxGetNzmax(instance_mat_col);
+
+ // number of instances
+ l = mxGetN(instance_mat_col);
+ label_vector_row_num = mxGetM(label_vec);
+ prob.l = (int) l;
+
+ if(label_vector_row_num!=l)
+ {
+ mexPrintf("Length of label vector does not match # of instances.\n");
+ return -1;
+ }
+
+ elements = num_samples + l;
+ max_index = mxGetM(instance_mat_col);
+
+ prob.y = Malloc(double,l);
+ prob.x = Malloc(struct svm_node *,l);
+ x_space = Malloc(struct svm_node, elements);
+
+ j = 0;
+ for(i=0;i<l;i++)
+ {
+ prob.x[i] = &x_space[j];
+ prob.y[i] = labels[i];
+ low = jc[i], high = jc[i+1];
+ for(k=low;k<high;k++)
+ {
+ x_space[j].index = (int)ir[k] + 1;
+ x_space[j].value = samples[k];
+ j++;
+ }
+ x_space[j++].index = -1;
+ }
+
+ if(param.gamma == 0 && max_index > 0)
+ param.gamma = (double)(1.0/max_index);
+
+ return 0;
+}
+
+static void fake_answer(int nlhs, mxArray *plhs[])
+{
+ int i;
+ for(i=0;i<nlhs;i++)
+ plhs[i] = mxCreateDoubleMatrix(0, 0, mxREAL);
+}
+
+// Interface function of matlab
+// now assume prhs[0]: label prhs[1]: data
+void mexFunction( int nlhs, mxArray *plhs[],
+ int nrhs, const mxArray *prhs[] )
+{
+ const char *error_msg;
+
+ // fix random seed to have same results for each run
+ // (for cross validation and probability estimation)
+ srand(1);
+
+ if(nlhs > 1)
+ {
+ exit_with_help();
+ fake_answer(nlhs, plhs);
+ return;
+ }
+
+ // Transform the input Matrix to libsvm format
+ if(nrhs > 1 && nrhs < 4)
+ {
+ int err;
+
+ if(!mxIsDouble(prhs[0]) || !mxIsDouble(prhs[1]))
+ {
+ mexPrintf("Error: label vector and instance matrix must be double\n");
+ fake_answer(nlhs, plhs);
+ return;
+ }
+
+ if(mxIsSparse(prhs[0]))
+ {
+ mexPrintf("Error: label vector should not be in sparse format\n");
+ fake_answer(nlhs, plhs);
+ return;
+ }
+
+ if(parse_command_line(nrhs, prhs, NULL))
+ {
+ exit_with_help();
+ svm_destroy_param(&param);
+ fake_answer(nlhs, plhs);
+ return;
+ }
+
+ if(mxIsSparse(prhs[1]))
+ {
+ if(param.kernel_type == PRECOMPUTED)
+ {
+ // precomputed kernel requires dense matrix, so we make one
+ mxArray *rhs[1], *lhs[1];
+
+ rhs[0] = mxDuplicateArray(prhs[1]);
+ if(mexCallMATLAB(1, lhs, 1, rhs, "full"))
+ {
+ mexPrintf("Error: cannot generate a full training instance matrix\n");
+ svm_destroy_param(&param);
+ fake_answer(nlhs, plhs);
+ return;
+ }
+ err = read_problem_dense(prhs[0], lhs[0]);
+ mxDestroyArray(lhs[0]);
+ mxDestroyArray(rhs[0]);
+ }
+ else
+ err = read_problem_sparse(prhs[0], prhs[1]);
+ }
+ else
+ err = read_problem_dense(prhs[0], prhs[1]);
+
+ // svmtrain's original code
+ error_msg = svm_check_parameter(&prob, &param);
+
+ if(err || error_msg)
+ {
+ if (error_msg != NULL)
+ mexPrintf("Error: %s\n", error_msg);
+ svm_destroy_param(&param);
+ free(prob.y);
+ free(prob.x);
+ free(x_space);
+ fake_answer(nlhs, plhs);
+ return;
+ }
+
+ if(cross_validation)
+ {
+ double *ptr;
+ plhs[0] = mxCreateDoubleMatrix(1, 1, mxREAL);
+ ptr = mxGetPr(plhs[0]);
+ ptr[0] = do_cross_validation();
+ }
+ else
+ {
+ int nr_feat = (int)mxGetN(prhs[1]);
+ const char *error_msg;
+ model = svm_train(&prob, &param);
+ error_msg = model_to_matlab_structure(plhs, nr_feat, model);
+ if(error_msg)
+ mexPrintf("Error: can't convert libsvm model to matrix structure: %s\n", error_msg);
+ svm_free_and_destroy_model(&model);
+ }
+ svm_destroy_param(&param);
+ free(prob.y);
+ free(prob.x);
+ free(x_space);
+ }
+ else
+ {
+ exit_with_help();
+ fake_answer(nlhs, plhs);
+ return;
+ }
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/python/Makefile b/src/backend/app/algorithms/evaluate/libsvm/python/Makefile
new file mode 100644
index 0000000..9837052
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/python/Makefile
@@ -0,0 +1,4 @@
+all = lib
+
+lib:
+ make -C .. lib
diff --git a/src/backend/app/algorithms/evaluate/libsvm/python/README b/src/backend/app/algorithms/evaluate/libsvm/python/README
new file mode 100644
index 0000000..cfa6420
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/python/README
@@ -0,0 +1,367 @@
+----------------------------------
+--- Python interface of LIBSVM ---
+----------------------------------
+
+Table of Contents
+=================
+
+- Introduction
+- Installation
+- Quick Start
+- Design Description
+- Data Structures
+- Utility Functions
+- Additional Information
+
+Introduction
+============
+
+Python (http://www.python.org/) is a programming language suitable for rapid
+development. This tool provides a simple Python interface to LIBSVM, a library
+for support vector machines (http://www.csie.ntu.edu.tw/~cjlin/libsvm). The
+interface is very easy to use as the usage is the same as that of LIBSVM. The
+interface is developed with the built-in Python library "ctypes."
+
+Installation
+============
+
+On Unix systems, type
+
+> make
+
+The interface needs only the LIBSVM shared library, which is generated by
+the above command. We assume that the shared library is in the LIBSVM
+main directory or in the system path.
+
+For Windows, the shared library libsvm.dll for 32-bit Python is ready
+in the directory `..\windows'. You can also copy it to the system
+directory (e.g., `C:\WINDOWS\system32\' for Windows XP). To regenerate
+the shared library, please follow the instructions for building Windows
+binaries in the LIBSVM README.
+
+Quick Start
+===========
+
+There are two levels of usage. The high-level one uses utility functions
+in svmutil.py, and the usage is the same as that of the LIBSVM MATLAB interface.
+
+>>> from svmutil import *
+# Read data in LIBSVM format
+>>> y, x = svm_read_problem('../heart_scale')
+>>> m = svm_train(y[:200], x[:200], '-c 4')
+>>> p_label, p_acc, p_val = svm_predict(y[200:], x[200:], m)
+
+# Construct problem in python format
+# Dense data
+>>> y, x = [1,-1], [[1,0,1], [-1,0,-1]]
+# Sparse data
+>>> y, x = [1,-1], [{1:1, 3:1}, {1:-1,3:-1}]
+>>> prob = svm_problem(y, x)
+>>> param = svm_parameter('-t 0 -c 4 -b 1')
+>>> m = svm_train(prob, param)
+
+# Precomputed kernel data (-t 4)
+# Dense data
+>>> y, x = [1,-1], [[1, 2, -2], [2, -2, 2]]
+# Sparse data
+>>> y, x = [1,-1], [{0:1, 1:2, 2:-2}, {0:2, 1:-2, 2:2}]
+# isKernel=True must be set for precomputed kernel
+>>> prob = svm_problem(y, x, isKernel=True)
+>>> param = svm_parameter('-t 4 -c 4 -b 1')
+>>> m = svm_train(prob, param)
+# For the format of precomputed kernel, please read LIBSVM README.
+
+
+# Other utility functions
+>>> svm_save_model('heart_scale.model', m)
+>>> m = svm_load_model('heart_scale.model')
+>>> p_label, p_acc, p_val = svm_predict(y, x, m, '-b 1')
+>>> ACC, MSE, SCC = evaluations(y, p_label)
+
+# Getting online help
+>>> help(svm_train)
+
+The low-level usage directly calls the C interfaces imported by svm.py. Note
+that all arguments and return values are in ctypes format, so you need to
+handle them carefully.
+
+>>> from svm import *
+>>> prob = svm_problem([1,-1], [{1:1, 3:1}, {1:-1,3:-1}])
+>>> param = svm_parameter('-c 4')
+>>> m = libsvm.svm_train(prob, param) # m is a ctype pointer to an svm_model
+# Convert a Python-format instance to svm_nodearray, a ctypes structure
+>>> x0, max_idx = gen_svm_nodearray({1:1, 3:1})
+>>> label = libsvm.svm_predict(m, x0)
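+# (libsvm.svm_predict returns a C double, which ctypes converts to a Python float)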
+
+Design Description
+==================
+
+There are two files svm.py and svmutil.py, which respectively correspond to
+low-level and high-level use of the interface.
+
+In svm.py, we adopt the Python built-in library "ctypes," so that
+Python can directly access C structures and interface functions defined
+in svm.h.
+
+While advanced users can use the structures/functions in svm.py directly,
+svmutil.py provides some easy-to-use functions so that handling ctypes
+structures can be avoided. The usage is similar to that of the LIBSVM
+MATLAB interface.
+
+Data Structures
+===============
+
+Four data structures derived from svm.h are svm_node, svm_problem, svm_parameter,
+and svm_model. They all contain fields with the same names as in svm.h. Access
+these fields carefully because you are directly using a C structure instead of
+a Python object. For svm_model, accessing the fields directly is not recommended.
+Programmers should use the interface functions or methods of the svm_model class
+in Python to get the values. The following description introduces additional
+fields and methods.
+
+Before using the data structures, execute the following command to load the
+LIBSVM shared library:
+
+ >>> from svm import *
+
+- class svm_node:
+
+ Construct an svm_node.
+
+ >>> node = svm_node(idx, val)
+
+ idx: an integer indicating the feature index.
+
+ val: a float indicating the feature value.
+
+ Show the index and the value of a node.
+
+ >>> print(node)
+
+- Function: gen_svm_nodearray(xi [,feature_max=None [,isKernel=False]])
+
+ Generate a feature vector from a Python list/tuple or a dictionary:
+
+ >>> xi, max_idx = gen_svm_nodearray({1:1, 3:1, 5:-2})
+
+ xi: the returned svm_nodearray (a ctypes structure)
+
+ max_idx: the maximal feature index of xi
+
+ feature_max: if feature_max is assigned, features with indices larger than
+ feature_max are removed.
+
+ isKernel: if isKernel == True, the list index starts from 0 for precomputed
+ kernel. Otherwise, the list index starts from 1. The default
+ value is False.
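+
+ For instance, to drop features with index larger than 4 (a sketch; the
+ data values here are only illustrative):
+
+ >>> xi, max_idx = gen_svm_nodearray({1:1, 3:1, 5:-2}, feature_max=4)
+ >>> max_idx # feature 5 was filtered out, so this is 3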
+
+- class svm_problem:
+
+ Construct an svm_problem instance
+
+ >>> prob = svm_problem(y, x)
+
+ y: a Python list/tuple of l labels (type must be int/double).
+
+ x: a Python list/tuple of l data instances. Each element of x must be
+ an instance of list/tuple/dictionary type.
+
+ Note that if your x contains sparse data (i.e., dictionary), the internal
+ ctypes data format is still sparse.
+
+ For pre-computed kernel, the isKernel flag should be set to True:
+
+ >>> prob = svm_problem(y, x, isKernel=True)
+
+ Please read LIBSVM README for more details of pre-computed kernel.
+
+- class svm_parameter:
+
+ Construct an svm_parameter instance
+
+ >>> param = svm_parameter('training_options')
+
+ If 'training_options' is empty, LIBSVM default values are applied.
+
+ Set param to LIBSVM default values.
+
+ >>> param.set_to_default_values()
+
+ Parse a string of options.
+
+ >>> param.parse_options('training_options')
+
+ Show values of parameters.
+
+ >>> print(param)
+
+- class svm_model:
+
+ There are two ways to obtain an instance of svm_model:
+
+ >>> model = svm_train(y, x)
+ >>> model = svm_load_model('model_file_name')
+
+ Note that the returned structure of interface functions
+ libsvm.svm_train and libsvm.svm_load_model is a ctypes pointer to an
+ svm_model, which is different from the svm_model object returned
+ by svm_train and svm_load_model in svmutil.py. We provide a
+ function toPyModel for the conversion:
+
+ >>> model_ptr = libsvm.svm_train(prob, param)
+ >>> model = toPyModel(model_ptr)
+
+ If you obtain a model in a way other than the above approaches,
+ handle it carefully to avoid memory leaks or segmentation faults.
+
+ Some interface functions to access LIBSVM models are wrapped as
+ members of the class svm_model:
+
+ >>> svm_type = model.get_svm_type()
+ >>> nr_class = model.get_nr_class()
+ >>> svr_probability = model.get_svr_probability()
+ >>> class_labels = model.get_labels()
+ >>> sv_indices = model.get_sv_indices()
+ >>> nr_sv = model.get_nr_sv()
+ >>> is_prob_model = model.is_probability_model()
+ >>> support_vector_coefficients = model.get_sv_coef()
+ >>> support_vectors = model.get_SV()
+
+Utility Functions
+=================
+
+To use utility functions, type
+
+ >>> from svmutil import *
+
+The above command loads
+ svm_train() : train an SVM model
+ svm_predict() : predict testing data
+ svm_read_problem() : read data from a LIBSVM-format file
+ svm_load_model() : load a LIBSVM model
+ svm_save_model() : save a model to a file
+ evaluations() : evaluate prediction results
+
+- Function: svm_train
+
+ There are three ways to call svm_train()
+
+ >>> model = svm_train(y, x [, 'training_options'])
+ >>> model = svm_train(prob [, 'training_options'])
+ >>> model = svm_train(prob, param)
+
+ y: a list/tuple of l training labels (type must be int/double).
+
+ x: a list/tuple of l training instances. The feature vector of
+ each training instance is an instance of list/tuple or dictionary.
+
+ training_options: a string in the same form as that for LIBSVM command
+ mode.
+
+ prob: an svm_problem instance generated by calling
+ svm_problem(y, x).
+ For pre-computed kernel, you should use
+ svm_problem(y, x, isKernel=True)
+
+ param: an svm_parameter instance generated by calling
+ svm_parameter('training_options')
+
+ model: the returned svm_model instance. See svm.h for details of this
+ structure. If '-v' is specified, cross validation is
+ conducted and the returned value is a scalar instead: cross-validation
+ accuracy for classification or mean squared error for regression.
+
+ To train the same data many times with different
+ parameters, the second and third ways should be faster, since the data
+ are converted into an svm_problem only once.
+
+ Examples:
+
+ >>> y, x = svm_read_problem('../heart_scale')
+ >>> prob = svm_problem(y, x)
+ >>> param = svm_parameter('-s 3 -c 5 -h 0')
+ >>> m = svm_train(y, x, '-c 5')
+ >>> m = svm_train(prob, '-t 2 -c 5')
+ >>> m = svm_train(prob, param)
+ >>> CV_ACC = svm_train(y, x, '-v 3')
+
+- Function: svm_predict
+
+ To predict testing data with a model, use
+
+ >>> p_labs, p_acc, p_vals = svm_predict(y, x, model [,'predicting_options'])
+
+ y: a list/tuple of l true labels (type must be int/double). It is used
+ for calculating the accuracy. Use [0]*len(x) if true labels are
+ unavailable.
+
+ x: a list/tuple of l predicting instances. The feature vector of
+ each predicting instance is an instance of list/tuple or dictionary.
+
+ predicting_options: a string of predicting options in the same format as
+ that of LIBSVM.
+
+ model: an svm_model instance.
+
+ p_labs: a list of predicted labels
+
+ p_acc: a tuple including accuracy (for classification), mean
+ squared error, and squared correlation coefficient (for
+ regression).
+
+ p_vals: a list of decision values or probability estimates (if '-b 1'
+ is specified). If k is the number of classes in training data,
+ for decision values, each element includes results of predicting
+ k(k-1)/2 binary-class SVMs. For classification, k = 1 is a
+ special case. Decision value [+1] is returned for each testing
+ instance, instead of an empty list.
+ For probabilities, each element contains k values indicating
+ the probability that the testing instance is in each class.
+ Note that the order of classes is the same as the 'model.label'
+ field in the model structure.
+
+ Example:
+
+ >>> m = svm_train(y, x, '-c 5')
+ >>> p_labels, p_acc, p_vals = svm_predict(y, x, m)
+
+- Functions: svm_read_problem/svm_load_model/svm_save_model
+
+ See the usage by examples:
+
+ >>> y, x = svm_read_problem('data.txt')
+ >>> m = svm_load_model('model_file')
+ >>> svm_save_model('model_file', m)
+
+- Function: evaluations
+
+ Calculate some evaluations using the true values (ty) and predicted
+ values (pv):
+
+ >>> (ACC, MSE, SCC) = evaluations(ty, pv)
+
+ ty: a list of true values.
+
+ pv: a list of predicted values.
+
+ ACC: accuracy.
+
+ MSE: mean squared error.
+
+ SCC: squared correlation coefficient.
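+
+ A small illustrative run (the numbers are made up for this example):
+
+ >>> ty = [1, -1, 1, -1]
+ >>> pv = [1, -1, -1, -1]
+ >>> ACC, MSE, SCC = evaluations(ty, pv)
+ >>> ACC # 3 of 4 predictions match the true values, so ACC is 75.0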
+
+
+Additional Information
+======================
+
+This interface was written by Hsiang-Fu Yu from the Department of Computer
+Science, National Taiwan University. If you find this tool useful, please
+cite LIBSVM as follows:
+
+Chih-Chung Chang and Chih-Jen Lin, LIBSVM : a library for support
+vector machines. ACM Transactions on Intelligent Systems and
+Technology, 2:27:1--27:27, 2011. Software available at
+http://www.csie.ntu.edu.tw/~cjlin/libsvm
+
+For any question, please contact Chih-Jen Lin <cjlin@csie.ntu.edu.tw>,
+or check the FAQ page:
+
+http://www.csie.ntu.edu.tw/~cjlin/libsvm/faq.html
diff --git a/src/backend/app/algorithms/evaluate/libsvm/python/__init__.py b/src/backend/app/algorithms/evaluate/libsvm/python/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/src/backend/app/algorithms/evaluate/libsvm/python/allmodel b/src/backend/app/algorithms/evaluate/libsvm/python/allmodel
new file mode 100644
index 0000000..19237f0
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/python/allmodel
@@ -0,0 +1,778 @@
+svm_type epsilon_svr
+kernel_type rbf
+gamma 0.05
+nr_class 2
+total_sv 770
+rho -155.845
+probA 6.34795
+SV
+-1024 1:-0.597198 2:-0.143425 3:-0.250373 4:0.434151 5:-0.770629 6:-0.120473 7:-0.261166 8:0.437833 9:-0.749714 10:-0.204235 11:-0.170752 12:-0.101459 13:-0.597951 14:-0.584089 15:-0.168594 16:-0.110996 17:-0.594929 18:-0.593067 19:0.41183 20:0.210422 21:0.395286 22:0.165987 23:-0.339455 24:0.391707 25:0.373055 26:0.0738007 27:-0.299791 28:0.239653 29:0.50318 30:-0.356149 31:-0.0408499 32:-0.288528 33:0.533753 34:-0.386524 35:-0.0501816 36:-0.314717
+-1024 1:-0.750162 2:-0.467213 3:-0.516279 4:0.449112 5:-0.921984 6:-0.450775 7:-0.528675 8:0.11969 9:-0.864933 10:-0.64794 11:-0.434466 12:0.150394 13:-0.846166 14:-0.791906 15:-0.432509 16:0.129298 17:-0.843114 18:-0.800605 19:-0.0421003 20:-0.126157 21:-0.0164766 22:0.436398 23:-0.675814 24:0.0658424 25:0.0178726 26:-0.200528 27:-0.481642 28:-0.395445 29:0.118467 30:-0.0683621 31:-0.465541 32:-0.551679 33:0.141398 34:-0.234465 35:-0.431874 36:-0.600774
+1024 1:-0.754095 2:-0.57969 3:-0.545981 4:0.386669 5:-0.95345 6:-0.599779 7:-0.561229 8:0.148354 9:-0.909084 10:-0.719219 11:-0.426844 12:0.221294 13:-0.90574 14:-0.851418 15:-0.424881 16:0.263192 17:-0.90921 18:-0.848744 19:0.0128477 20:-0.273723 21:-0.0329473 22:0.412618 23:-0.750295 24:-0.151472 25:-0.110853 26:0.177895 27:-0.687629 28:-0.364115 29:0.141398 30:-0.229309 31:-0.580118 32:-0.730525 33:0.133754 34:-0.108961 35:-0.608392 36:-0.707541
+1024 1:-0.827579 2:-0.730515 3:-0.684592 4:0.30859 5:-0.987043 6:-0.754401 7:-0.657476 8:0.365962 9:-0.985231 10:-0.802138 11:-0.518305 12:0.473747 13:-0.97618 14:-0.902786 15:-0.520988 16:0.49167 17:-0.976604 18:-0.902236 19:-0.376186 20:-0.488749 21:-0.329417 22:0.329266 23:-0.839657 24:-0.445371 25:-0.392137 26:0.436789 27:-0.866552 28:-0.479491 29:-0.197454 30:-0.0526299 31:-0.741962 32:-0.820072 33:-0.20255 34:-0.0423801 35:-0.743262 36:-0.817102
+1024 1:-0.767757 2:-0.62048 3:-0.567197 4:0.367756 5:-0.963823 6:-0.646388 7:-0.549906 8:0.23761 9:-0.944241 10:-0.754318 11:-0.419222 12:0.339618 13:-0.936645 14:-0.865917 15:-0.418779 16:0.380955 17:-0.939338 18:-0.862834 19:-0.00213742 20:-0.305426 21:-0.0211825 22:0.520895 23:-0.807083 24:-0.171009 25:-0.108469 26:0.191881 27:-0.716547 28:-0.406812 29:0.133754 30:-0.162965 31:-0.626002 32:-0.745449 33:0.118467 34:-0.0535133 35:-0.648119 36:-0.723103
+1024 1:-0.80688 2:-0.577756 3:-0.649232 4:0.268326 5:-0.920659 6:-0.556767 7:-0.661722 8:0.0462289 9:-0.866098 10:-0.683699 11:-0.573181 12:-0.0651311 13:-0.840936 14:-0.856903 15:-0.582008 16:-0.0465824 17:-0.838542 18:-0.854912 19:-0.28439 20:-0.331716 21:-0.235299 22:0.265429 23:-0.719942 24:-0.243311 25:-0.287252 26:0.0841282 27:-0.653742 28:-0.399161 29:-0.057328 30:-0.400365 31:-0.535528 32:-0.770343 33:-0.0751625 34:-0.370103 35:-0.535209 36:-0.758974
+-1024 1:-0.733396 2:-0.252835 3:-0.490819 4:0.389673 5:-0.80005 6:-0.174505 7:-0.498952 8:-0.056824 9:-0.655384 10:-0.405482 11:-0.414649 12:-0.117863 13:-0.659958 14:-0.670002 15:-0.414202 16:-0.194532 17:-0.63791 18:-0.686694 19:0.0665449 20:0.139822 21:0.0988159 22:0.145322 23:-0.358059 24:0.330798 25:0.115606 26:-0.316136 27:-0.163024 28:-0.0445953 29:0.235666 30:-0.212578 31:-0.162703 32:-0.331073 33:0.248405 34:-0.262375 35:-0.146177 36:-0.345406
+-1024 1:-0.806466 2:-0.536068 3:-0.647817 4:0.119674 5:-0.890573 6:-0.601368 7:-0.643322 8:0.319161 9:-0.899704 10:-0.570371 11:-0.571656 12:0.138569 13:-0.860178 14:-0.813805 15:-0.575906 16:0.173889 17:-0.862989 18:-0.810866 19:-0.271901 20:-0.166852 21:-0.195299 22:0.0530176 23:-0.565781 24:-0.197588 25:-0.201435 26:0.230437 27:-0.589918 28:-0.106301 29:-0.100641 30:-0.179108 31:-0.439731 32:-0.593424 33:-0.105736 34:-0.146424 35:-0.44865 36:-0.585002
+-1024 1:-0.816195 2:-0.681926 3:-0.714293 4:0.0499216 5:-0.930058 6:-0.733805 7:-0.697106 8:0.186659 9:-0.943557 10:-0.758924 11:-0.66007 12:-0.00271959 13:-0.893449 14:-0.89908 15:-0.658285 16:0.221992 17:-0.922859 18:-0.876669 19:-0.406784 20:-0.449934 21:-0.46118 22:0.185869 23:-0.751298 24:-0.39387 25:-0.382602 26:0.104513 27:-0.742664 28:-0.541231 29:-0.273888 30:-0.201123 31:-0.660823 32:-0.807371 33:-0.304459 34:-0.106083 35:-0.674502 36:-0.783354
+-1024 1:-0.750162 2:-0.44733 3:-0.554468 4:0.0323952 5:-0.814088 6:-0.525189 7:-0.552737 8:0.412513 9:-0.862639 10:-0.428555 11:-0.414649 12:-0.277997 13:-0.751988 14:-0.824795 15:-0.414202 16:-0.0962064 17:-0.782675 18:-0.804619 19:0.0553058 20:-0.142664 21:-0.0117707 22:-0.0164067 23:-0.480999 24:-0.127488 25:-0.0774796 26:0.303814 27:-0.531896 28:0.103466 29:0.299358 30:-0.903045 31:-0.258641 32:-0.742719 33:0.256048 34:-0.640023 35:-0.317343 36:-0.68181
+-427.0784594459771 1:-0.799428 2:-0.547487 3:-0.642159 4:0.177161 5:-0.91301 6:-0.612435 7:-0.636245 8:0.183091 9:-0.873817 10:-0.609977 11:-0.568607 12:0.173332 13:-0.869457 14:-0.814237 15:-0.575906 16:-0.00903964 17:-0.830472 18:-0.833693 19:-0.183853 20:-0.210562 21:-0.127065 22:-0.0726811 23:-0.561614 24:-0.339713 25:-0.184748 26:0.145947 27:-0.570966 28:-0.174362 29:-0.0293022 30:-0.287888 31:-0.455066 32:-0.655692 33:-0.0675192 34:-0.151269 35:-0.478462 36:-0.613272
+-411.5410311984782 1:-0.646044 2:-0.280814 3:-0.32958 4:0.261539 5:-0.815427 6:-0.369665 7:-0.344675 8:0.265171 9:-0.785062 10:-0.405276 11:-0.344528 12:0.0234771 13:-0.661527 14:-0.608231 15:-0.330298 16:0.00636535 17:-0.670683 18:-0.634358 19:0.255734 20:0.0483862 21:0.221167 22:-0.0654165 23:-0.362803 24:0.0253642 25:0.218107 26:-0.188661 27:-0.307204 28:-0.103013 29:0.26624 30:0.136422 31:-0.292648 32:-0.247005 33:0.322289 34:-0.263028 35:-0.203382 36:-0.395982
+1024 1:-0.793218 2:-0.464121 3:-0.598313 4:0.455156 5:-0.915576 6:-0.39247 7:-0.613598 8:-0.102528 9:-0.776896 10:-0.631424 11:-0.5305 12:-0.0161003 13:-0.794206 14:-0.786841 15:-0.530142 16:-0.0900461 17:-0.774245 18:-0.797865 19:-0.0808169 20:-0.0698727 21:-0.0800064 22:0.369057 23:-0.591043 24:0.181274 25:-0.0560254 26:-0.306348 27:-0.31359 28:-0.269808 29:0.0547751 30:-0.218015 31:-0.356328 32:-0.527599 33:0.0598707 34:-0.239568 35:-0.349962 36:-0.532869
+1024 1:-0.802119 2:-0.561077 3:-0.635087 4:-0.0299323 5:-0.869102 6:-0.671591 7:-0.634829 8:0.397416 9:-0.922555 10:-0.576765 11:-0.591473 12:-0.0846808 13:-0.817238 14:-0.838262 15:-0.592687 16:0.118327 17:-0.850385 18:-0.814091 19:-0.37244 20:-0.33491 21:-0.310594 22:0.00927521 23:-0.639968 24:-0.389091 25:-0.308706 26:0.273034 27:-0.724018 28:-0.327833 29:-0.217837 30:-0.255932 31:-0.524886 32:-0.713183 33:-0.228028 34:-0.192585 35:-0.538847 36:-0.696697
+-1024 1:-0.718078 2:-0.162085 3:-0.355038 4:0.842688 5:-0.849902 6:0.120661 7:-0.317782 8:-0.450263 9:-0.543909 10:-0.526596 11:-0.315566 12:-0.492019 13:-0.465173 14:-0.623233 15:-0.31657 16:-0.371992 17:-0.489195 18:-0.60418 19:-0.136394 20:0.0705487 21:0.0799922 22:0.610825 23:-0.583153 24:0.55625 25:0.125141 26:-0.876925 27:-0.0204984 28:-0.461932 29:0.187258 30:-0.707114 31:-0.00254089 32:-0.479617 33:0.187258 34:-0.578537 35:-0.0567139 36:-0.452994
+-1024 1:-0.7572 2:-0.531582 3:-0.560125 4:0.19147 5:-0.912392 6:-0.618225 7:-0.556983 8:0.3802 9:-0.923511 10:-0.611689 11:-0.463427 12:0.152749 13:-0.869271 14:-0.82163 15:-0.464544 16:0.165832 17:-0.869421 18:-0.822846 19:-0.0539651 20:-0.237854 21:-0.0800064 22:0.114788 23:-0.66631 24:-0.316994 25:-0.0941662 26:0.144154 27:-0.659981 28:-0.346327 29:0.00891474 30:-0.0571892 31:-0.550359 32:-0.633482 33:-0.00891984 34:-0.112242 35:-0.530137 36:-0.640329
+-1024 1:-0.793011 2:-0.474152 3:-0.625187 4:0.0617193 5:-0.843727 6:-0.5499 7:-0.617844 8:0.25398 9:-0.851701 10:-0.512061 11:-0.55489 12:0.00482227 13:-0.796794 14:-0.782876 15:-0.562176 16:0.0563141 17:-0.802202 18:-0.7769 19:-0.203835 20:-0.105843 21:-0.117654 22:0.0180045 23:-0.524351 24:-0.164563 25:-0.134691 26:0.156802 27:-0.520642 28:-0.0653464 29:-0.0420412 30:-0.158776 31:-0.388073 32:-0.531024 33:-0.0471368 34:-0.118587 35:-0.39637 36:-0.51736
+1024 1:-0.663845 2:-0.253597 3:-0.323922 4:0.936985 5:-0.908008 6:-0.00279026 7:-0.302213 8:-0.508275 9:-0.608938 10:-0.640934 11:-0.236299 12:-0.770747 13:-0.530742 14:-0.778675 15:-0.235716 16:-0.491208 17:-0.582456 18:-0.735793 19:0.209529 20:-0.0648295 21:0.242344 22:0.868965 23:-0.762672 24:0.399853 25:0.258632 26:-0.628954 27:-0.313947 28:-0.541719 29:0.439487 30:-0.898035 31:-0.219973 32:-0.705845 33:0.434392 34:-0.788857 35:-0.245884 36:-0.680229
+-1024 1:-0.828821 2:-0.702025 3:-0.701563 4:0.130718 5:-0.96519 6:-0.777706 7:-0.714091 8:0.414503 9:-0.977258 10:-0.723717 11:-0.594522 12:0.343719 13:-0.957107 14:-0.89426 15:-0.600315 16:0.359296 17:-0.956958 18:-0.893383 19:-0.40616 20:-0.393829 21:-0.383533 22:0.0341689 23:-0.700991 24:-0.47312 25:-0.437429 26:0.396716 27:-0.776394 28:-0.287586 29:-0.284079 30:-0.216887 31:-0.615873 32:-0.777137 33:-0.286627 34:-0.139055 35:-0.637843 36:-0.763572
+1024 1:-0.835859 2:-0.843068 3:-0.728437 4:0.131478 5:-0.995819 6:-0.891274 7:-0.728245 8:0.232275 9:-0.994701 10:-0.895779 11:-0.641777 12:0.458785 13:-0.991702 14:-0.939673 15:-0.644555 16:0.47265 17:-0.991805 18:-0.93966 19:-0.306871 20:-0.596819 21:-0.308241 22:0.426359 23:-0.937443 24:-0.630877 25:-0.332544 26:0.444858 27:-0.929431 28:-0.657668 29:-0.174524 30:0.35714 31:-0.892635 32:-0.841947 33:-0.174524 34:0.341911 35:-0.890638 36:-0.843604
+1024 1:-0.802119 2:-0.59002 3:-0.643574 4:-0.0825782 5:-0.867018 6:-0.704044 7:-0.624921 8:0.676035 9:-0.97314 10:-0.559346 11:-0.522878 12:0.0121643 13:-0.872589 14:-0.868388 15:-0.520988 16:0.123966 17:-0.886095 18:-0.857503 19:-0.159499 20:-0.281325 21:-0.150593 22:-0.059739 23:-0.568992 24:-0.338564 25:-0.218122 26:0.488505 27:-0.71516 28:-0.0521416 29:0.092992 30:-0.681965 31:-0.428769 32:-0.787667 33:0.0777053 34:-0.417803 35:-0.498441 36:-0.739911
+1024 1:-0.810192 2:-0.668037 3:-0.613871 4:0.156041 5:-0.962099 6:-0.769391 7:-0.650399 8:0.639437 9:-0.990493 10:-0.674078 11:-0.503061 12:0.510944 13:-0.963245 14:-0.865039 15:-0.505733 16:0.505195 17:-0.962098 18:-0.867358 19:-0.215076 20:-0.282302 21:-0.225887 22:0.0230908 23:-0.619503 24:-0.334106 25:-0.249111 26:0.707337 27:-0.811463 28:-0.0223151 29:-0.0293022 30:-0.112288 31:-0.591812 32:-0.699367 33:-0.0216588 34:-0.195535 35:-0.573772 36:-0.717595
+943.3646940970501 1:-0.812262 2:-0.434076 3:-0.602556 4:0.850041 5:-0.957628 6:-0.155058 7:-0.589536 8:-0.633411 9:-0.635276 10:-0.7711 11:-0.545743 12:-0.467752 13:-0.664857 14:-0.818454 15:-0.553025 16:-0.50833 17:-0.648009 18:-0.827588 19:-0.353081 20:-0.204011 21:-0.178828 22:0.778524 23:-0.80824 24:0.220693 25:-0.168062 26:-0.821186 27:-0.25761 28:-0.69896 29:-0.0726148 30:-0.68323 31:-0.300043 32:-0.71512 33:-0.0828059 34:-0.714795 35:-0.285222 36:-0.718875
+1024 1:-0.835652 2:-0.813601 3:-0.741167 4:0.234848 5:-0.996914 6:-0.84419 7:-0.702767 8:0.248381 9:-0.992788 10:-0.883457 11:-0.614339 12:0.523812 13:-0.991364 14:-0.926135 15:-0.61557 16:0.524291 17:-0.9909 18:-0.92718 19:-0.366195 20:-0.534926 21:-0.334121 22:0.535648 23:-0.924937 24:-0.474284 25:-0.377834 26:0.270691 27:-0.869732 28:-0.645708 29:-0.238219 30:0.259388 31:-0.843803 32:-0.814409 33:-0.235671 34:0.22642 35:-0.841082 36:-0.822276
+1024 1:-0.758235 2:-0.452418 3:-0.565783 4:0.0157567 5:-0.81035 6:-0.530048 7:-0.56406 8:0.42761 9:-0.868833 10:-0.428876 11:-0.428368 12:-0.271463 13:-0.755565 14:-0.826935 15:-0.427932 16:-0.0954375 17:-0.785189 18:-0.80754 19:0.0259594 20:-0.146023 21:-0.0211825 22:-0.0392872 23:-0.467473 24:-0.132679 25:-0.0917824 26:0.287693 27:-0.524624 28:0.0976285 29:0.284074 30:-0.916393 31:-0.255876 32:-0.747064 33:0.240762 34:-0.646359 35:-0.31625 36:-0.684659
+-1024 1:-0.549595 2:-0.0927019 3:-0.166925 4:0.483957 5:-0.764041 6:-0.0720085 7:-0.170581 8:0.510928 9:-0.746286 10:-0.148869 11:-0.111302 12:-0.0831255 13:-0.557791 14:-0.527611 15:-0.113675 16:-0.0908151 17:-0.554598 18:-0.536008 19:0.484883 20:0.248915 21:0.442343 22:0.141216 23:-0.293771 24:0.444804 25:0.439799 26:0.0274228 27:-0.25333 28:0.263233 29:0.564327 30:-0.370431 31:0.00666342 32:-0.24978 33:0.589805 34:-0.398254 35:-0.000332883 36:-0.273091
+-1024 1:-0.835031 2:-0.497447 3:-0.670448 4:0.792408 5:-0.969594 6:-0.227144 7:-0.667384 8:-0.683997 9:-0.642727 10:-0.819249 11:-0.623485 12:-0.455457 13:-0.696291 14:-0.849766 15:-0.630825 16:-0.492703 17:-0.68032 18:-0.858242 19:-0.401165 20:-0.255757 21:-0.223534 22:0.872863 23:-0.862044 24:0.184175 25:-0.234808 26:-0.816871 27:-0.286914 28:-0.734944 29:-0.131212 30:-0.636356 31:-0.361063 32:-0.74365 33:-0.136307 34:-0.670109 35:-0.346814 36:-0.747628
+1024 1:-0.844553 2:-0.82547 3:-0.806228 4:-0.011664 5:-0.972416 6:-0.863427 7:-0.784861 8:0.185095 9:-0.987091 10:-0.872646 11:-0.757628 12:0.191292 13:-0.964965 14:-0.940165 15:-0.737611 16:0.358194 17:-0.979425 18:-0.930281 19:-0.549786 20:-0.622245 21:-0.621179 22:0.23294 23:-0.869684 24:-0.592898 25:-0.587607 26:0.180851 27:-0.866693 28:-0.707359 29:-0.492996 30:0.109775 31:-0.841358 32:-0.866605 33:-0.505735 34:0.137008 35:-0.844253 36:-0.859514
+-1024 1:-0.778728 2:-0.487993 3:-0.595484 4:0.219852 5:-0.868436 6:-0.476405 7:-0.605106 8:0.0690168 9:-0.850812 10:-0.645437 11:-0.532024 12:-0.081353 13:-0.805785 14:-0.822407 15:-0.537769 16:-0.0790328 17:-0.802103 18:-0.824393 19:-0.148883 20:-0.200664 21:-0.155299 22:0.338832 23:-0.647976 24:0.0128201 25:-0.139459 26:-0.270134 27:-0.4388 28:-0.417452 29:0.0369405 30:-0.486413 31:-0.40429 32:-0.697966 33:0.0318449 34:-0.501991 35:-0.399541 36:-0.700902
+1024 1:-0.850142 2:-0.69463 3:-0.760968 4:0.128498 5:-0.954519 6:-0.731284 7:-0.770707 8:0.0289113 9:-0.900615 10:-0.74869 11:-0.704275 12:0.154686 13:-0.915115 14:-0.880687 15:-0.72083 16:0.0227529 17:-0.886098 18:-0.890736 19:-0.378059 20:-0.391214 21:-0.369415 22:-0.00317163 23:-0.684469 24:-0.484133 25:-0.399288 26:0.135788 27:-0.67645 28:-0.384925 29:-0.266245 30:-0.0897008 31:-0.645257 32:-0.749238 33:-0.291723 34:-0.034391 35:-0.64882 36:-0.729146
+1024 1:-0.679162 2:-0.309982 3:-0.408785 4:0.191677 5:-0.810516 6:-0.405883 7:-0.38572 8:0.792222 9:-0.897782 10:-0.267411 11:-0.250018 12:0.159743 13:-0.777168 14:-0.698105 15:-0.255548 16:0.0918917 17:-0.764547 18:-0.71368 19:0.210153 20:0.108241 21:0.225873 22:-0.0262385 23:-0.363415 24:0.0761026 25:0.163282 26:0.379494 27:-0.458151 28:0.350073 29:0.380888 30:-0.311586 31:-0.175479 32:-0.393923 33:0.385984 34:-0.466738 35:-0.130797 36:-0.439967
+1024 1:-0.855731 2:-0.770504 3:-0.796328 4:0.127422 5:-0.977234 6:-0.795495 7:-0.796184 8:0.133196 9:-0.962147 10:-0.819254 11:-0.733238 12:0.269152 13:-0.960008 14:-0.913618 15:-0.733035 16:0.257442 17:-0.958133 18:-0.917862 19:-0.593498 20:-0.537209 21:-0.595296 22:0.08949 23:-0.788701 24:-0.577859 25:-0.616212 26:0.0177078 27:-0.725067 28:-0.604365 29:-0.549045 30:0.0440187 31:-0.740143 32:-0.794836 33:-0.543949 34:-0.0508287 35:-0.719922 36:-0.814728
+-665.7197624629371 1:-0.849728 2:-0.709512 3:-0.775112 4:0.382929 5:-0.981961 6:-0.628472 7:-0.74806 8:-0.0245649 9:-0.93393 10:-0.845918 11:-0.687507 12:0.17567 13:-0.93652 14:-0.904239 15:-0.67354 16:0.258886 17:-0.945376 18:-0.89915 19:-0.537297 20:-0.424537 21:-0.458827 22:0.723325 23:-0.892527 24:-0.0647632 25:-0.523246 26:-0.361137 27:-0.530286 28:-0.681083 29:-0.393632 30:-0.495193 31:-0.524799 32:-0.827065 33:-0.391084 34:-0.374101 35:-0.566079 36:-0.80849
+-1024 1:-0.802947 2:-0.584108 3:-0.595484 4:0.500499 5:-0.97446 6:-0.58334 7:-0.602275 8:0.513941 9:-0.964038 10:-0.64194 11:-0.43599 12:0.573303 13:-0.952616 14:-0.827614 15:-0.443186 16:0.567678 17:-0.95049 18:-0.827607 19:-0.283766 20:-0.176293 21:-0.174122 22:0.409811 23:-0.650711 24:0.111958 25:-0.208586 26:0.260605 27:-0.624843 28:-0.139579 29:-0.0293022 30:-0.341037 31:-0.418371 32:-0.64912 33:-0.0165632 34:-0.297427 35:-0.434666 36:-0.641464
+1024 1:-0.835652 2:-0.741314 3:-0.704392 4:0.272382 5:-0.984962 6:-0.761259 7:-0.675876 8:0.336506 9:-0.984743 10:-0.811139 11:-0.551841 12:0.443032 13:-0.975004 14:-0.906045 15:-0.556076 16:0.455819 17:-0.975094 18:-0.906043 19:-0.397418 20:-0.503647 21:-0.362356 22:0.32762 23:-0.84326 24:-0.452645 25:-0.423126 26:0.392417 27:-0.860387 28:-0.505028 29:-0.233123 30:-0.0504308 31:-0.748311 32:-0.826231 33:-0.235671 34:-0.0379865 35:-0.750785 36:-0.823471
+1024 1:-0.818679 2:-0.732314 3:-0.646403 4:0.208224 5:-0.982511 6:-0.803959 7:-0.667384 8:0.380166 9:-0.981595 10:-0.776399 11:-0.542695 12:0.42721 13:-0.97083 14:-0.901412 15:-0.545397 16:0.423847 17:-0.969805 18:-0.90328 19:-0.255041 20:-0.434113 21:-0.287064 22:0.256231 23:-0.803975 24:-0.442937 25:-0.284868 26:0.448246 27:-0.827897 28:-0.378465 29:-0.0802581 30:-0.00502253 31:-0.728022 32:-0.786574 33:-0.0726148 34:-0.0671697 35:-0.715786 36:-0.796878
+-1024 1:-0.827579 2:-0.713541 3:-0.702978 4:0.148329 5:-0.970534 6:-0.783569 7:-0.715506 8:0.39464 9:-0.978512 10:-0.741089 11:-0.596046 12:0.355102 13:-0.960681 14:-0.897672 15:-0.603366 16:0.37013 17:-0.960441 18:-0.896665 19:-0.398042 20:-0.406321 21:-0.385886 22:0.0910771 23:-0.73301 24:-0.470359 25:-0.432662 26:0.398222 27:-0.786594 28:-0.311495 29:-0.278984 30:-0.204893 31:-0.632476 32:-0.785786 33:-0.284079 34:-0.119296 35:-0.656247 36:-0.771518
+1024 1:-0.82613 2:-0.740492 3:-0.663376 4:0.196591 5:-0.98243 6:-0.806805 7:-0.684368 8:0.363498 9:-0.982078 10:-0.783435 11:-0.564034 12:0.410232 13:-0.97097 14:-0.90496 15:-0.568278 16:0.407049 17:-0.969921 18:-0.906756 19:-0.310618 20:-0.454475 21:-0.343533 22:0.240002 23:-0.805907 24:-0.461003 25:-0.346845 26:0.409139 27:-0.826074 28:-0.4088 29:-0.146498 30:-0.0280771 31:-0.732905 32:-0.801408 33:-0.138855 34:-0.102343 35:-0.717745 36:-0.813193
+-1024 1:-0.859043 2:-0.653528 3:-0.751067 4:0.273465 5:-0.955702 6:-0.623118 7:-0.749476 8:-0.331029 9:-0.814465 10:-0.827972 11:-0.762201 12:0.322067 13:-0.90005 14:-0.794917 15:-0.754392 16:-0.417584 17:-0.761885 18:-0.914051 19:-0.54479 20:-0.483803 21:-0.522355 22:0.190897 23:-0.790856 24:-0.469265 25:-0.544698 26:-0.441436 27:-0.523331 28:-0.744846 29:-0.521019 30:0.0355444 31:-0.672193 32:-0.731178 33:-0.480257 34:-0.296596 35:-0.60376 36:-0.814619
+-1024 1:-0.827786 2:-0.468447 3:-0.681763 4:0.0649242 5:-0.800819 6:-0.445481 7:-0.681538 8:0.0577109 9:-0.771827 10:-0.496799 11:-0.594522 12:-0.301249 13:-0.707966 14:-0.804025 15:-0.594212 16:-0.42336 17:-0.676575 18:-0.827378 19:-0.347461 20:-0.207184 21:-0.223534 22:-0.0466796 23:-0.497561 24:-0.202798 25:-0.289636 26:-0.106845 27:-0.432636 28:-0.2445 29:-0.151594 30:-0.499237 31:-0.307066 32:-0.644921 33:-0.138855 34:-0.628729 35:-0.277426 36:-0.681828
+1024 1:-0.824681 2:-0.654816 3:-0.729851 4:0.0571401 5:-0.912998 6:-0.684602 7:-0.718337 8:0.170221 9:-0.931944 10:-0.735915 11:-0.644826 12:0.0645663 13:-0.897861 14:-0.885166 15:-0.650657 16:0.0556647 17:-0.893488 18:-0.887789 19:-0.334347 20:-0.391163 21:-0.376474 22:0.221596 23:-0.728014 24:-0.304922 25:-0.401672 26:-0.0561688 27:-0.638445 28:-0.523578 29:-0.263697 30:-0.271624 31:-0.5935 32:-0.779398 33:-0.266245 34:-0.297935 35:-0.586599 36:-0.783732
+-1024 1:-0.909548 2:-0.851166 3:-0.865633 4:-0.0568482 5:-0.975221 6:-0.886086 7:-0.862707 8:0.0192099 9:-0.976958 10:-0.9054 11:-0.824699 12:0.140332 13:-0.966193 14:-0.9526 15:-0.829142 16:0.169643 17:-0.96784 18:-0.952166 19:-0.71901 20:-0.725282 21:-0.738824 22:0.141972 23:-0.898311 24:-0.743111 25:-0.771157 26:0.176822 27:-0.90783 28:-0.789401 29:-0.696817 30:0.0988678 31:-0.870318 32:-0.904095 33:-0.694269 34:0.0850908 35:-0.870922 36:-0.907903
+-1024 1:-0.846002 2:-0.744333 3:-0.785013 4:0.0899306 5:-0.956679 6:-0.758537 7:-0.766461 8:0.0712779 9:-0.947711 10:-0.822999 11:-0.71647 12:0.114892 13:-0.934221 14:-0.917053 15:-0.710152 16:0.156921 17:-0.938934 18:-0.916803 19:-0.582258 20:-0.555479 21:-0.581179 22:0.405928 23:-0.88657 24:-0.441682 25:-0.597142 26:0.0624057 27:-0.768694 28:-0.632782 29:-0.482805 30:-0.117162 31:-0.736152 32:-0.85093 33:-0.50828 34:-0.131736 35:-0.72464 36:-0.847521
+-1024 1:-0.846623 2:-0.827018 3:-0.756725 4:0.113794 5:-0.990346 6:-0.866469 7:-0.736737 8:0.193497 9:-0.988502 10:-0.882712 11:-0.678362 12:0.399449 13:-0.98366 14:-0.929619 15:-0.687269 16:0.373881 17:-0.981309 18:-0.933044 19:-0.58538 20:-0.66472 21:-0.529414 22:0.417406 23:-0.953005 24:-0.660809 25:-0.556617 26:0.343899 27:-0.926207 28:-0.70962 29:-0.50828 30:0.387678 31:-0.897553 32:-0.832138 33:-0.513376 34:0.242882 35:-0.874839 36:-0.85574
+1024 1:-0.832547 2:-0.457807 3:-0.619529 4:-0.146396 5:-0.793083 6:-0.622951 7:-0.623506 8:-0.0884737 9:-0.790779 10:-0.643956 11:-0.605192 12:-0.228445 13:-0.706307 14:-0.77631 15:-0.607942 16:-0.262833 17:-0.696038 18:-0.788152 19:-0.551659 20:-0.386582 21:-0.440004 22:-0.0928933 23:-0.65788 24:-0.539896 25:-0.466035 26:-0.146649 27:-0.634039 28:-0.610407 29:-0.411467 30:-0.136188 31:-0.578379 32:-0.718955 33:-0.416562 34:-0.169875 35:-0.574579 36:-0.730966
+-1024 1:-0.589333 2:-0.262924 3:-0.282905 4:-0.0511743 5:-0.725994 6:-0.451403 7:-0.227198 8:-0.116177 9:-0.717011 10:-0.550529 11:-0.161606 12:-0.170569 13:-0.674421 14:-0.697981 15:-0.168594 16:-0.167447 17:-0.673057 18:-0.703149 19:0.145841 20:-0.212314 21:0.00705298 22:0.240391 23:-0.690481 24:-0.223921 25:0.0583949 26:-0.0559384 27:-0.605492 28:-0.446322 29:0.243309 30:-0.150213 31:-0.553644 32:-0.670743 33:0.220379 34:-0.178759 35:-0.542652 36:-0.673272
+-1024 1:-0.762582 2:-0.567395 3:-0.512036 4:0.559402 5:-0.98086 6:-0.601838 7:-0.507444 8:0.655576 9:-0.979315 10:-0.645659 11:-0.260689 12:0.710307 13:-0.966004 14:-0.828324 15:-0.264702 16:0.726265 17:-0.966647 18:-0.827419 19:0.0702909 20:-0.147635 21:0.0776393 22:0.421635 23:-0.683365 24:0.0160728 25:0.044092 26:0.626184 27:-0.749454 28:-0.00343194 29:0.278979 30:-0.0609511 31:-0.532672 32:-0.609974 33:0.28917 34:-0.0528361 35:-0.538284 36:-0.611509
+-1024 1:-0.843104 2:-0.803164 3:-0.736923 4:0.0970639 5:-0.984755 6:-0.85615 7:-0.728245 8:0.160425 9:-0.980759 10:-0.871264 11:-0.690556 12:0.33073 13:-0.971991 14:-0.920931 15:-0.699473 16:0.250299 17:-0.963933 18:-0.929823 19:-0.57414 20:-0.644435 21:-0.541178 22:0.297789 23:-0.917728 24:-0.661035 25:-0.547082 26:0.264723 27:-0.897463 28:-0.704838 29:-0.526115 30:0.308477 31:-0.859763 32:-0.813395 33:-0.523567 34:0.136758 35:-0.828413 36:-0.843627
+-1024 1:-0.724495 2:-0.41189 3:-0.516279 4:-0.0508824 5:-0.774588 6:-0.522702 7:-0.486213 8:-0.108906 9:-0.761992 10:-0.614176 11:-0.416173 12:-0.127716 13:-0.751368 14:-0.774733 15:-0.418779 16:-0.129665 17:-0.749481 18:-0.780553 19:-0.107044 20:-0.292273 21:-0.176475 22:0.277423 23:-0.734765 24:-0.267084 25:-0.144226 26:-0.018663 27:-0.6447 28:-0.483167 29:0.00636694 30:-0.173219 31:-0.588575 32:-0.720716 33:-0.0165632 34:-0.197341 35:-0.579536 36:-0.723181
+1024 1:-0.808329 2:-0.718603 3:-0.635087 4:0.308998 5:-0.984808 6:-0.756944 7:-0.606521 8:0.33962 9:-0.982319 10:-0.812586 11:-0.486293 12:0.499353 13:-0.97489 14:-0.895048 15:-0.493529 16:0.505656 17:-0.974229 18:-0.89457 19:-0.175735 20:-0.41908 21:-0.18824 22:0.458038 23:-0.847615 24:-0.33896 25:-0.263414 26:0.287598 27:-0.80112 28:-0.482619 29:-0.0216588 30:-0.0134565 31:-0.729158 32:-0.789334 33:-0.0165632 34:0.0409855 35:-0.741844 36:-0.781231
+-1024 1:-0.868978 2:-0.690065 3:-0.794914 4:0.247783 5:-0.963404 6:-0.651149 7:-0.793353 8:-0.321964 9:-0.82533 10:-0.841064 11:-0.780494 12:0.201319 13:-0.90473 14:-0.848936 15:-0.778801 16:-0.284774 17:-0.816051 18:-0.919991 19:-0.593498 20:-0.525085 21:-0.585885 22:0.222704 23:-0.824476 24:-0.50119 25:-0.606677 26:-0.393226 27:-0.556323 28:-0.755515 29:-0.582166 30:0.0224463 31:-0.699203 32:-0.767084 33:-0.559236 34:-0.255998 35:-0.637386 36:-0.831746
+-1024 1:-0.736708 2:-0.280893 3:-0.444145 4:0.786631 5:-0.896371 6:-0.0614542 7:-0.406951 8:-0.514787 9:-0.607819 10:-0.651869 11:-0.394832 12:-0.398815 13:-0.581911 14:-0.703181 15:-0.405049 16:-0.454938 17:-0.563477 18:-0.718685 19:-0.148258 20:-0.0874333 21:-0.0094177 22:0.650508 23:-0.71399 24:0.274704 25:0.0226402 26:-0.815393 27:-0.218599 28:-0.623677 29:0.11592 30:-0.563422 31:-0.247798 32:-0.604911 33:0.0980851 34:-0.607147 35:-0.232508 36:-0.615201
+1024 1:-0.831305 2:-0.629144 3:-0.693077 4:0.485946 5:-0.980528 6:-0.581097 7:-0.682953 8:0.0802008 9:-0.9147 10:-0.756912 11:-0.591473 12:0.309824 13:-0.928207 14:-0.855288 15:-0.60184 16:0.277059 17:-0.921303 18:-0.858817 19:-0.271277 20:-0.251971 21:-0.249417 22:0.547621 23:-0.775877 24:-0.00364927 25:-0.234808 26:-0.213289 27:-0.483814 28:-0.434268 29:-0.143951 30:-0.206536 31:-0.506606 32:-0.669985 33:-0.136307 34:-0.278393 35:-0.4851 36:-0.683465
+1024 1:-0.877672 2:-0.675823 3:-0.763797 4:0.341521 5:-0.973691 6:-0.63004 7:-0.742399 8:0.0544003 9:-0.932928 10:-0.802909 11:-0.689031 12:0.159387 13:-0.920669 14:-0.886957 15:-0.69032 16:0.260082 17:-0.931936 18:-0.877615 19:-0.599118 20:-0.459541 21:-0.510592 22:0.498904 23:-0.857641 24:-0.247944 25:-0.511327 26:-0.182464 27:-0.647274 28:-0.666818 29:-0.459875 30:-0.250309 31:-0.596185 32:-0.788978 33:-0.477709 34:-0.0676212 35:-0.643729 36:-0.749501
+925.2770589880586 1:-0.797979 2:-0.423487 3:-0.586998 4:0.0478721 5:-0.815654 6:-0.510571 7:-0.58529 8:0.0105781 9:-0.779305 10:-0.557411 11:-0.560986 12:-0.0701184 13:-0.721861 14:-0.726244 15:-0.560651 16:-0.193122 17:-0.69149 18:-0.752009 19:-0.320609 20:-0.211437 21:-0.183534 22:-0.177502 23:-0.522362 24:-0.396967 25:-0.184748 26:-0.257239 27:-0.449687 28:-0.424838 29:-0.151594 30:-0.103322 31:-0.448124 32:-0.564552 33:-0.151594 34:-0.217358 35:-0.418842 36:-0.597007
+1024 1:-0.684751 2:-0.339758 3:-0.415857 4:0.124035 5:-0.820437 6:-0.481994 7:-0.415444 8:0.34115 9:-0.84011 10:-0.458258 11:-0.399405 12:-0.0596658 13:-0.700811 14:-0.691767 15:-0.398947 16:-0.0904477 17:-0.692509 18:-0.702526 19:0.0684176 20:-0.121243 21:0.042345 22:-0.0156025 23:-0.558789 24:-0.264183 25:0.0655463 26:-0.00506539 27:-0.556597 28:-0.314374 29:0.11592 30:-0.053371 31:-0.444861 32:-0.523371 33:0.113372 34:-0.0751991 35:-0.440812 36:-0.530933
+1024 1:-0.722218 2:-0.401009 3:-0.483747 4:0.205238 5:-0.849633 6:-0.469529 7:-0.489044 8:0.273322 9:-0.843968 10:-0.504806 11:-0.414649 12:0.106802 13:-0.785949 14:-0.729454 15:-0.400473 16:0.0580913 17:-0.784091 18:-0.751776 19:0.131481 20:-0.0440445 21:0.0988159 22:0.0511215 23:-0.476786 24:-0.0316236 25:0.0726977 26:-0.0481226 27:-0.43275 28:-0.151937 29:0.143945 30:0.143535 31:-0.39933 32:-0.363192 33:0.210188 34:-0.27716 35:-0.305646 36:-0.504041
+1024 1:-0.818472 2:-0.79188 3:-0.686006 4:0.120161 5:-0.985024 6:-0.854017 7:-0.690029 8:0.268434 9:-0.987253 10:-0.852064 11:-0.5869 12:0.37446 13:-0.97898 14:-0.928894 15:-0.58811 16:0.397702 17:-0.979562 18:-0.928036 19:-0.355579 20:-0.556364 21:-0.378827 22:0.373871 23:-0.898523 24:-0.551602 25:-0.365915 26:0.38754 27:-0.894905 28:-0.606178 29:-0.200002 30:0.0843923 31:-0.824643 32:-0.852613 33:-0.189811 34:0.129728 35:-0.835302 36:-0.848128
+-1024 1:-0.801291 2:-0.474372 3:-0.635087 4:0.0293302 5:-0.840467 6:-0.568432 7:-0.633414 8:0.251015 9:-0.846287 10:-0.498618 11:-0.579278 12:-0.0820655 13:-0.780751 14:-0.797275 15:-0.580483 16:-0.0414815 17:-0.785557 18:-0.794838 19:-0.303125 20:-0.169117 21:-0.167063 22:-0.124162 23:-0.511209 24:-0.317411 25:-0.251495 26:0.196247 27:-0.559193 28:-0.0870388 29:-0.100641 30:-0.257672 31:-0.411127 32:-0.607401 33:-0.113377 34:-0.195189 35:-0.429684 36:-0.592955
+-1024 1:-0.790941 2:-0.567591 3:-0.612457 4:0.339027 5:-0.939038 6:-0.567163 7:-0.612183 8:0.169382 9:-0.910767 10:-0.704436 11:-0.518305 12:0.205898 13:-0.894744 14:-0.839159 15:-0.514886 16:0.206323 17:-0.895146 18:-0.844524 19:-0.200089 20:-0.22738 21:-0.183534 22:0.40849 23:-0.717501 24:-0.0548294 25:-0.163294 26:-0.145857 27:-0.541649 28:-0.444947 29:-0.0420412 30:-0.108309 31:-0.533003 32:-0.642856 33:-0.0420412 34:-0.228717 35:-0.504957 36:-0.67246
+-1024 1:-0.700897 2:-0.271843 3:-0.418686 4:0.090563 5:-0.759343 6:-0.385488 7:-0.404121 8:0.21313 9:-0.748415 10:-0.364578 11:-0.365869 12:-0.190771 13:-0.625616 14:-0.66062 15:-0.391319 16:-0.124299 17:-0.620467 18:-0.6361 19:-0.0414759 20:-0.0157087 21:0.127051 22:-0.315285 23:-0.334864 24:-0.241359 25:0.0583949 26:-0.0375086 27:-0.374012 28:-0.0408296 29:0.194901 30:-0.343687 31:-0.242697 32:-0.483151 33:0.0980851 34:-0.0851391 35:-0.274879 36:-0.371539
+690.7693750794288 1:-0.866081 2:-0.777981 3:-0.824615 4:0.192054 5:-0.984727 6:-0.770096 7:-0.80326 8:-0.0498712 9:-0.94172 10:-0.869376 11:-0.750006 12:0.254077 13:-0.959779 14:-0.916629 15:-0.766596 16:0.087201 17:-0.939056 18:-0.934847 19:-0.622848 20:-0.617179 21:-0.665885 22:0.277934 23:-0.891224 24:-0.590695 25:-0.680575 26:-0.2099 27:-0.69168 28:-0.773676 29:-0.645861 30:0.090458 31:-0.792818 32:-0.830749 33:-0.633122 34:-0.161298 35:-0.738908 36:-0.878847
+-1024 1:-0.868151 2:-0.81749 3:-0.823201 4:0.0227867 5:-0.978931 6:-0.85976 7:-0.825906 8:0.112162 9:-0.972675 10:-0.853917 11:-0.814028 12:0.216889 13:-0.959965 14:-0.924678 15:-0.816937 16:0.18216 17:-0.9553 18:-0.931577 19:-0.665934 20:-0.673151 21:-0.705883 22:0.0454067 23:-0.857752 24:-0.753647 25:-0.730632 26:0.152363 27:-0.854595 28:-0.705197 29:-0.689174 30:0.0981912 31:-0.819315 32:-0.855328 33:-0.689174 34:-0.0589791 35:-0.778088 36:-0.877364
+-1024 1:-0.819093 2:-0.658113 3:-0.649232 4:0.400704 5:-0.980029 6:-0.666531 7:-0.650399 8:0.432024 9:-0.974546 10:-0.7215 11:-0.510683 12:0.535585 13:-0.965186 14:-0.861851 15:-0.514886 16:0.554409 17:-0.965419 18:-0.859849 19:-0.380557 20:-0.286146 21:-0.343533 22:0.34616 23:-0.684477 24:-0.0374565 25:-0.384986 26:0.218666 27:-0.65577 28:-0.246298 29:-0.174524 30:-0.355891 31:-0.500269 32:-0.733236 33:-0.164333 34:-0.274886 35:-0.530243 36:-0.721695
+1024 1:-0.848693 2:-0.695008 3:-0.770869 4:0.343473 5:-0.970116 6:-0.60791 7:-0.743814 8:-0.0241994 9:-0.927562 10:-0.834011 11:-0.69208 12:0.115274 13:-0.921376 14:-0.900458 15:-0.679642 16:0.210244 17:-0.932865 18:-0.894123 19:-0.538545 20:-0.412517 21:-0.465886 22:0.708333 23:-0.880984 24:-0.03837 25:-0.520862 26:-0.403329 27:-0.499492 28:-0.682662 29:-0.39618 30:-0.526037 31:-0.500074 32:-0.822723 33:-0.391084 34:-0.406968 35:-0.543031 36:-0.804805
+-1024 1:-0.71932 2:-0.233648 3:-0.401713 4:0.497464 5:-0.812582 6:-0.124553 7:-0.387136 8:-0.207267 9:-0.632583 10:-0.484181 11:-0.315566 12:-0.399232 13:-0.575024 14:-0.690494 15:-0.324196 16:-0.355323 17:-0.578116 18:-0.685932 19:-0.120782 20:-0.0499932 21:-0.00706475 22:0.281497 23:-0.548397 24:0.142131 25:0.0178726 26:-0.417455 27:-0.27378 28:-0.325113 29:0.149041 30:-0.488144 31:-0.250518 32:-0.567801 33:0.143945 34:-0.545323 35:-0.227944 36:-0.578113
+-1024 1:-0.85304 2:-0.662333 3:-0.78077 4:-0.0325229 5:-0.880121 6:-0.673266 7:-0.757968 8:0.31229 9:-0.940168 10:-0.649058 11:-0.698177 12:-0.0334431 13:-0.868779 14:-0.882121 15:-0.708626 16:-0.118464 17:-0.848382 18:-0.893241 19:-0.494833 20:-0.400964 21:-0.470592 22:0.113718 23:-0.65855 24:-0.294577 25:-0.497024 26:0.00136819 27:-0.590652 28:-0.390474 29:-0.309555 30:-0.583963 31:-0.451598 32:-0.802929 33:-0.312103 34:-0.670423 35:-0.42906 36:-0.820326
+-724.0794067186615 1:-0.828821 2:-0.711378 3:-0.727022 4:0.0754448 5:-0.950841 6:-0.766609 7:-0.723998 8:0.0883857 9:-0.93867 10:-0.79895 11:-0.690556 12:0.183125 13:-0.926394 14:-0.887885 15:-0.691845 16:0.121309 17:-0.915753 18:-0.896488 19:-0.363072 20:-0.463768 21:-0.371768 22:0.186466 23:-0.803228 24:-0.511189 25:-0.380218 26:0.0709913 27:-0.746476 28:-0.579923 29:-0.312103 30:0.109517 31:-0.735106 32:-0.753846 33:-0.301914 34:0.0742962 35:-0.728567 36:-0.761
+-776.2168081652 1:-0.732154 2:-0.380047 3:-0.456875 4:0.890111 5:-0.942885 6:-0.137757 7:-0.455075 8:-0.612715 9:-0.642699 10:-0.749118 11:-0.37654 12:-0.697205 13:-0.625776 14:-0.841378 15:-0.374539 16:-0.490354 17:-0.664584 18:-0.817613 19:0.160203 20:-0.132128 21:0.190581 22:0.992112 23:-0.823836 24:0.380322 25:0.196656 26:-0.619115 27:-0.359985 28:-0.591634 29:0.408914 30:-0.935243 31:-0.27961 32:-0.760226 33:0.40127 34:-0.79783 35:-0.311052 36:-0.730734
+-1024 1:-0.862355 2:-0.853258 3:-0.831687 4:0.0213758 5:-0.988287 6:-0.890735 7:-0.830153 8:0.145579 9:-0.9872 10:-0.88267 11:-0.792689 12:0.27308 13:-0.978825 14:-0.943146 15:-0.797107 16:0.272428 17:-0.977974 18:-0.945269 19:-0.662812 20:-0.714645 21:-0.731765 22:0.0993856 23:-0.896763 24:-0.779583 25:-0.754471 26:0.186953 27:-0.888139 28:-0.740059 29:-0.707006 30:0.132692 31:-0.857941 32:-0.880819 33:-0.714649 34:0.00674761 35:-0.82582 36:-0.895176
+1024 1:-0.732775 2:-0.422537 3:-0.517693 4:0.0419854 5:-0.821611 6:-0.537064 7:-0.523013 8:0.412899 9:-0.864757 10:-0.441361 11:-0.419222 12:-0.0659044 13:-0.778268 14:-0.783079 15:-0.418779 16:-0.0458305 17:-0.780457 18:-0.785209 19:0.122115 20:-0.0655795 21:0.0705804 22:0.00391727 23:-0.450995 24:-0.045533 25:0.0583949 26:0.233573 27:-0.535652 28:-0.00211695 29:0.220379 30:-0.298319 31:-0.345599 32:-0.550391 33:0.256048 34:-0.362226 35:-0.340285 36:-0.575319
+1024 1:-0.853661 2:-0.759641 3:-0.785013 4:0.156928 5:-0.978375 6:-0.783539 7:-0.783446 8:0.145328 9:-0.961218 10:-0.812496 11:-0.719519 12:0.27997 13:-0.958579 14:-0.908849 15:-0.72083 16:0.259834 17:-0.955878 18:-0.914002 19:-0.577262 20:-0.52158 21:-0.571767 22:0.101915 23:-0.784524 24:-0.55626 25:-0.587607 26:-0.00077821 27:-0.711103 28:-0.598199 29:-0.526115 30:0.0262968 31:-0.722851 32:-0.784813 33:-0.518472 34:-0.0810035 35:-0.700647 36:-0.808557
+-1024 1:-0.865874 2:-0.769572 3:-0.799156 4:-0.006282 5:-0.960476 6:-0.829901 7:-0.794769 8:0.123475 9:-0.958189 10:-0.814606 11:-0.797262 12:0.18746 13:-0.936392 14:-0.898604 15:-0.797107 16:0.110133 17:-0.926034 18:-0.913266 19:-0.654694 20:-0.627696 21:-0.656473 22:0.0102286 23:-0.828712 24:-0.733122 25:-0.69011 26:0.143283 27:-0.829702 28:-0.66634 29:-0.653505 30:0.106182 31:-0.790133 32:-0.82177 33:-0.638218 34:-0.100085 35:-0.740799 36:-0.858552
+1024 1:-0.740848 2:-0.462809 3:-0.51345 4:0.0509858 5:-0.854935 6:-0.598044 7:-0.514521 8:-0.000626225 9:-0.823829 10:-0.647376 11:-0.51678 12:0.106246 13:-0.792082 14:-0.737748 15:-0.514886 16:-0.00370813 17:-0.769601 18:-0.760096 19:-0.138891 20:-0.284269 21:-0.145889 22:0.0997798 23:-0.695831 24:-0.390939 25:-0.144226 26:-0.0268946 27:-0.637417 28:-0.478977 29:-0.0929971 30:0.133143 31:-0.619784 32:-0.615562 33:-0.0879015 34:0.106253 35:-0.613589 36:-0.621791
+1024 1:-0.683509 2:-0.269403 3:-0.394641 4:0.394787 5:-0.823927 6:-0.257589 7:-0.36732 8:-0.112798 9:-0.709482 10:-0.535553 11:-0.349101 12:-0.278632 13:-0.626527 14:-0.696479 15:-0.360809 16:0.0188653 17:-0.677635 18:-0.637857 19:0.0272083 20:-0.097961 21:0.0305825 22:0.43266 23:-0.668528 24:0.0730155 25:0.0774653 26:-0.345843 27:-0.440514 28:-0.470345 29:0.149041 30:-0.259315 31:-0.383087 32:-0.568364 33:0.123563 34:-0.105285 35:-0.418713 36:-0.524538
+1024 1:-0.808122 2:-0.553471 3:-0.654889 4:0.0746056 5:-0.888969 6:-0.63183 7:-0.636245 8:0.436783 9:-0.931972 10:-0.575366 11:-0.565559 12:0.185644 13:-0.878576 14:-0.82255 15:-0.566753 16:0.222779 17:-0.882188 18:-0.820278 19:-0.263159 20:-0.181703 21:-0.192946 22:0.0628014 23:-0.581186 24:-0.214167 25:-0.194284 26:0.278501 27:-0.624199 28:-0.116601 29:-0.0955449 30:-0.148666 31:-0.464918 32:-0.601062 33:-0.103188 34:-0.130929 35:-0.469125 36:-0.596168
[Omitted: ~140 lines of a raw LIBSVM sparse-format data file (extraction residue; each line carried a stray leading "+"). Every instance has 36 features; labels are mostly -1024 or 1024, with a few real values (e.g. -20.95379225139311, 931.7235305137201). Each line follows the usual LIBSVM format

    <label> <index1>:<value1> <index2>:<value2> ... <index36>:<value36>

for example:

    -1024 1:-0.695929 2:-0.581016 3:-0.444145 ... 36:-0.775992]
+-1024 1:-0.667156 2:-0.415437 3:-0.4102 4:0.0850168 5:-0.847603 6:-0.569111 7:-0.415444 8:0.0668707 9:-0.823767 10:-0.610714 11:-0.403978 12:0.118558 13:-0.783574 14:-0.721863 15:-0.400473 16:0.0253759 17:-0.766326 18:-0.74254 19:0.0278327 20:-0.240576 21:-0.0329473 22:0.142036 23:-0.697936 24:-0.350611 25:-0.0298035 26:0.0222759 27:-0.644128 28:-0.439577 29:0.0420361 30:0.169667 31:-0.610625 32:-0.585672 33:0.0471317 34:0.130486 35:-0.602403 36:-0.595801
+-1024 1:-0.729049 2:-0.182534 3:-0.366353 4:0.669219 5:-0.80762 6:0.0611978 7:-0.324859 8:-0.318396 9:-0.596587 10:-0.508119 11:-0.263738 12:-0.416557 13:-0.53969 14:-0.659317 15:-0.272329 16:-0.29495 17:-0.55856 18:-0.637737 19:-0.0801924 20:0.0952162 21:0.150581 22:0.645071 23:-0.590982 24:0.57796 25:0.184737 26:-0.856933 27:-0.0252656 28:-0.438093 29:0.230571 30:-0.714984 31:-0.00136669 32:-0.478749 33:0.235666 34:-0.593652 35:-0.0497603 36:-0.451489
+1024 1:-0.787215 2:-0.657496 3:-0.626601 4:0.0300539 5:-0.935775 6:-0.773514 7:-0.626337 8:0.182157 9:-0.936782 10:-0.755782 11:-0.608241 12:0.201953 13:-0.90788 14:-0.857477 15:-0.60184 16:0.116174 17:-0.899004 18:-0.876432 19:-0.360575 20:-0.485387 21:-0.35765 22:0.0360916 23:-0.80362 24:-0.658701 25:-0.401672 26:0.194572 27:-0.80755 28:-0.579993 29:-0.355415 30:0.221455 31:-0.768628 32:-0.74249 33:-0.352867 34:-0.017389 35:-0.718333 36:-0.790052
+-1024 1:-0.770655 2:-0.451268 3:-0.53608 4:0.0504871 5:-0.819086 6:-0.522814 7:-0.534337 8:0.138044 9:-0.818084 10:-0.542307 11:-0.413125 12:-0.0550347 13:-0.779732 14:-0.78083 15:-0.415728 16:-0.0366798 17:-0.779482 18:-0.780827 19:-0.0614589 20:-0.113032 21:0.0305825 22:0.0740606 23:-0.559646 24:-0.159393 25:0.0250216 26:0.0629339 27:-0.54076 28:-0.21419 29:0.143945 30:-0.150753 31:-0.438448 32:-0.565949 33:0.133754 34:-0.182967 35:-0.427847 36:-0.572376
+-1024 1:-0.672538 2:-0.111831 3:-0.339481 4:0.177885 5:-0.660898 6:-0.119293 7:-0.353167 8:0.359132 9:-0.696151 10:-0.142959 11:-0.318615 12:-0.189381 13:-0.493318 14:-0.511895 15:-0.311993 16:-0.320651 17:-0.457804 18:-0.547718 19:0.0759107 20:0.162334 21:0.164699 22:-0.0587537 23:-0.210723 24:0.300455 25:0.141828 26:-0.105811 27:-0.260581 28:0.0674484 29:0.235666 30:-0.296273 31:-0.0778436 32:-0.301514 33:0.235666 34:-0.361549 35:-0.0449005 36:-0.31213
+-1024 1:-0.851177 2:-0.461841 3:-0.633673 4:-0.0922779 5:-0.778118 6:-0.552853 7:-0.636245 8:0.0945735 9:-0.793314 10:-0.513739 11:-0.527451 12:-0.180127 13:-0.749543 14:-0.796437 15:-0.533193 16:-0.148556 17:-0.750799 18:-0.794193 19:-0.386802 20:-0.102101 21:-0.112948 22:-0.166562 23:-0.396884 24:-0.179597 25:-0.189516 26:0.101799 27:-0.449604 28:-0.00350168 29:0.0114625 30:-0.475941 31:-0.310479 32:-0.621247 33:-0.00382424 34:-0.438167 35:-0.313725 36:-0.606585
+-1024 1:-0.773139 2:-0.357191 3:-0.509207 4:0.0955436 5:-0.765355 6:-0.380044 7:-0.504613 8:0.105304 9:-0.748807 10:-0.438446 11:-0.373491 12:-0.0245024 13:-0.725383 14:-0.705692 15:-0.371487 16:-0.0808869 17:-0.716674 18:-0.724876 19:-0.050843 20:0.0365626 21:0.103522 22:0.00632994 23:-0.367973 24:0.109679 25:0.129909 26:-0.0996587 27:-0.330037 28:-0.0397637 29:0.197449 30:0.001309 31:-0.293412 32:-0.333915 33:0.240762 34:-0.392659 35:-0.196984 36:-0.46653
+-451.4227057833912 1:-0.848693 2:-0.427871 3:-0.596898 4:-0.164652 5:-0.727544 6:-0.530898 7:-0.616429 8:-0.00944076 9:-0.720082 10:-0.468046 11:-0.478671 12:-0.333701 13:-0.685659 14:-0.784352 15:-0.484375 16:-0.309706 17:-0.686262 18:-0.784983 19:-0.30812 20:-0.12157 21:-0.112948 22:0.0819697 23:-0.49602 24:-0.0269203 25:-0.0893986 26:0.0374299 27:-0.420615 28:-0.0331837 29:0.143945 30:-0.524933 31:-0.3344 32:-0.652898 33:0.16178 34:-0.622884 35:-0.316394 36:-0.680379
+-72.14470042253261 1:-0.892161 2:-0.364828 3:-0.670448 4:-0.118063 5:-0.653659 6:-0.360333 7:-0.667384 8:-0.118255 9:-0.645268 10:-0.435108 11:-0.571656 12:-0.41588 13:-0.599531 14:-0.745396 15:-0.57438 16:-0.387363 17:-0.59984 18:-0.744091 19:-0.575389 20:0.0102012 21:-0.232946 22:-0.112727 23:-0.18531 24:0.237278 25:-0.253879 26:-0.148492 27:-0.211567 28:0.0555237 29:-0.100641 30:-0.631208 31:-0.0785847 32:-0.53022 33:-0.100641 34:-0.595257 35:-0.0868051 36:-0.516652
+1024 1:-0.900026 2:-0.455888 3:-0.708635 4:-0.104325 5:-0.761935 6:-0.527505 7:-0.716921 8:-0.0601481 9:-0.741099 10:-0.537154 11:-0.629582 12:-0.307887 13:-0.704642 14:-0.807066 15:-0.635402 16:-0.252828 17:-0.709989 18:-0.800489 19:-0.604738 20:-0.261805 21:-0.296476 22:-0.197645 23:-0.472442 24:-0.35422 25:-0.323009 26:-0.0374917 27:-0.516844 28:-0.301064 29:-0.159237 30:-0.547529 31:-0.393129 32:-0.732072 33:-0.17962 34:-0.449131 35:-0.421154 36:-0.711791
+-1024 1:-0.704623 2:-0.302099 3:-0.411614 4:0.152684 5:-0.762511 6:-0.337836 7:-0.416859 8:0.203801 9:-0.750928 10:-0.375325 11:-0.266786 12:0.0232859 13:-0.712331 14:-0.669267 15:-0.263176 16:-0.0446087 17:-0.704079 18:-0.693429 19:0.164573 20:0.0832804 21:0.249403 22:0.0282783 23:-0.384872 24:0.110537 25:0.241945 26:-0.0665018 27:-0.336383 28:-0.00493622 29:0.281526 30:0.0612008 31:-0.257072 32:-0.25568 33:0.335028 34:-0.299765 35:-0.173602 36:-0.388422
+-1024 1:-0.80067 2:-0.482278 3:-0.564369 4:-0.0697831 5:-0.815362 6:-0.607009 7:-0.547075 8:0.156086 9:-0.847429 10:-0.58672 11:-0.474098 12:-0.168215 13:-0.779732 14:-0.820687 15:-0.484375 16:0.0181135 17:-0.80644 18:-0.794373 19:-0.216324 20:-0.268894 21:-0.183534 22:0.0506901 23:-0.614319 24:-0.291339 25:-0.103701 26:0.0556069 27:-0.622594 28:-0.370276 29:-0.00382424 30:-0.284496 31:-0.529416 32:-0.716093 33:-0.019111 34:-0.150092 35:-0.553992 36:-0.680196
+1024 1:-0.704209 2:-0.202848 3:-0.36211 4:0.0770199 5:-0.660303 6:-0.214432 7:-0.355997 8:0.220191 9:-0.668147 10:-0.209302 11:-0.172277 12:-0.460705 13:-0.561793 14:-0.692067 15:-0.182324 16:-0.342609 17:-0.573164 18:-0.667242 19:0.105256 20:0.053034 21:0.221167 22:-0.0116027 23:-0.341028 24:0.136152 25:0.156131 26:0.151037 27:-0.33119 28:0.285324 29:0.52611 30:-0.989609 31:-0.0637468 32:-0.632393 33:0.487893 34:-0.667562 35:-0.142807 36:-0.546207
+1024 1:-0.75306 2:-0.290906 3:-0.480919 4:0.230993 5:-0.743584 6:-0.209458 7:-0.45366 8:-0.0247342 9:-0.672176 10:-0.409642 11:-0.32776 12:-0.177703 13:-0.649303 14:-0.679171 15:-0.328772 16:-0.179323 17:-0.645768 18:-0.684328 19:-0.0077573 20:0.16479 21:0.141169 22:0.138404 23:-0.300212 24:0.43358 25:0.160898 26:-0.268364 27:-0.124736 28:0.0765736 29:0.317193 30:-0.446103 31:-0.0638045 32:-0.373422 33:0.314645 34:-0.442569 35:-0.0625835 36:-0.371164
+-1024 1:-0.845588 2:-0.416457 3:-0.623772 4:-0.168732 5:-0.68992 6:-0.472508 7:-0.613598 8:0.0786979 9:-0.760062 10:-0.467472 11:-0.507634 12:-0.322458 13:-0.681817 14:-0.779077 15:-0.504207 16:-0.383698 17:-0.667025 18:-0.795402 19:-0.243801 20:-0.106479 21:-0.0423591 22:-0.0424455 23:-0.411378 24:-0.0386812 25:-0.072712 26:-0.120465 27:-0.363573 28:-0.132302 29:0.123563 30:-0.650332 31:-0.237737 32:-0.637275 33:0.136302 34:-0.817879 35:-0.195142 36:-0.679713
+1024 1:-0.818679 2:-0.525972 3:-0.619529 4:0.0785524 5:-0.871171 6:-0.595933 7:-0.610767 8:0.0936866 9:-0.848105 10:-0.623031 11:-0.533548 12:0.0314012 13:-0.828726 14:-0.810888 15:-0.539295 16:0.0271617 17:-0.825337 18:-0.814598 19:-0.29688 20:-0.214118 21:-0.162357 22:-0.0530016 23:-0.5647 24:-0.323937 25:-0.160913 26:-0.153279 27:-0.479446 28:-0.357973 29:-0.131212 30:-0.11696 31:-0.480757 32:-0.601779 33:-0.12102 34:-0.247662 35:-0.450033 36:-0.637783
+-1024 1:-0.704209 2:-0.202848 3:-0.36211 4:0.0770199 5:-0.660303 6:-0.214432 7:-0.355997 8:0.220191 9:-0.668147 10:-0.209302 11:-0.172277 12:-0.460705 13:-0.561793 14:-0.692067 15:-0.182324 16:-0.342609 17:-0.573164 18:-0.667242 19:0.105256 20:0.053034 21:0.221167 22:-0.0116027 23:-0.341028 24:0.136152 25:0.156131 26:0.151037 27:-0.33119 28:0.285324 29:0.52611 30:-0.989609 31:-0.0637468 32:-0.632393 33:0.487893 34:-0.667562 35:-0.142807 36:-0.546207
+1024 1:-0.873118 2:-0.473545 3:-0.674691 4:-0.138794 5:-0.753881 6:-0.547875 7:-0.664553 8:-0.0814736 9:-0.728877 10:-0.537347 11:-0.574705 12:-0.273427 13:-0.708757 14:-0.792833 15:-0.572855 16:-0.302623 17:-0.698593 18:-0.80181 19:-0.395545 20:-0.127348 21:-0.105889 22:-0.216573 23:-0.379527 24:-0.213766 25:-0.153761 26:-0.056933 27:-0.371377 28:-0.0720907 29:0.0496795 30:-0.543074 31:-0.295564 32:-0.63803 33:0.0267493 34:-0.453299 35:-0.316928 36:-0.614141
+-1024 1:-0.767964 2:-0.310913 3:-0.442731 4:0.349664 5:-0.772533 6:-0.16375 7:-0.384305 8:-0.432938 9:-0.632648 10:-0.631433 11:-0.288128 12:-0.557733 13:-0.603356 14:-0.772225 15:-0.293687 16:-0.456775 17:-0.617959 18:-0.759705 19:-0.0133765 20:-0.0775153 21:0.138816 22:0.548127 23:-0.61353 24:0.366644 25:0.225259 26:-0.544795 27:-0.326681 28:-0.488984 29:0.434392 30:-0.873587 31:-0.239264 32:-0.709532 33:0.421653 34:-0.708168 35:-0.280771 36:-0.67298
+-705.7017307163042 1:-0.814332 2:-0.359359 3:-0.530423 4:0.188077 5:-0.746906 6:-0.248154 7:-0.467813 8:-0.433317 9:-0.635318 10:-0.641418 11:-0.375015 12:-0.560218 13:-0.611549 14:-0.787231 15:-0.379115 16:-0.488278 17:-0.620659 18:-0.779712 19:-0.128276 20:-0.117846 21:0.039992 22:0.436473 23:-0.571205 24:0.315463 25:0.189504 26:-0.586976 27:-0.308892 28:-0.508381 29:0.421653 30:-0.967778 31:-0.240281 32:-0.746452 33:0.391079 34:-0.879672 35:-0.252227 36:-0.723076
+-1024 1:-0.832547 2:-0.677476 3:-0.681763 4:-0.0319573 5:-0.927724 6:-0.786709 7:-0.674461 8:0.0409957 9:-0.92304 10:-0.796668 11:-0.64635 12:0.135823 13:-0.905717 14:-0.874248 15:-0.647606 16:0.0736757 17:-0.897178 18:-0.886918 19:-0.318112 20:-0.521383 21:-0.270593 22:0.214619 23:-0.862515 24:-0.620978 25:-0.26103 26:0.191302 27:-0.839351 28:-0.65056 29:-0.263697 30:0.282281 31:-0.807926 32:-0.762667 33:-0.256054 34:0.10776 35:-0.778551 36:-0.797912
+-1024 1:-0.655772 2:-0.571082 3:-0.390398 4:-0.00261497 5:-0.91399 6:-0.763084 7:-0.384305 8:0.12686 9:-0.917991 10:-0.762921 11:-0.349101 12:0.127273 13:-0.881111 14:-0.845155 15:-0.348604 16:0.0702751 17:-0.874626 18:-0.857732 19:0.0915202 20:-0.437473 21:-0.0329473 22:0.185076 23:-0.830616 24:-0.584681 25:-0.0178845 26:0.156724 27:-0.810944 28:-0.624673 29:0.0675141 30:0.288564 31:-0.781047 32:-0.729053 33:0.0649663 34:0.13313 35:-0.755526 36:-0.760019
+-1024 1:-0.58478 2:-0.2174 3:-0.270175 4:0.445834 5:-0.820223 6:-0.228456 7:-0.231444 8:-0.0382471 9:-0.714186 10:-0.495797 11:-0.207336 12:-0.24823 13:-0.615077 14:-0.665437 15:-0.221987 16:0.0518968 17:-0.663745 18:-0.604274 19:0.255109 20:-0.0422384 21:0.17411 22:0.413491 23:-0.638165 24:0.105914 25:0.244329 26:-0.352861 27:-0.419721 28:-0.43484 29:0.33248 30:-0.270351 31:-0.351243 32:-0.536349 33:0.312097 34:-0.104922 35:-0.387631 36:-0.48742
+1024 1:-0.734638 2:-0.312086 3:-0.434245 4:0.669657 5:-0.876487 6:-0.121648 7:-0.397044 8:-0.386611 9:-0.661174 10:-0.640536 11:-0.349101 12:-0.315125 13:-0.646308 14:-0.730457 15:-0.35013 16:-0.357698 17:-0.635428 18:-0.744226 19:-0.0664544 20:-0.0679238 21:0.0729334 22:0.683546 23:-0.719743 24:0.290571 25:0.0893843 26:-0.761879 27:-0.239403 28:-0.59137 29:0.16178 30:-0.507018 31:-0.267714 32:-0.590171 33:0.159232 34:-0.596595 35:-0.244585 36:-0.614553
+-1024 1:-0.91969 2:-0.538607 3:-0.755311 4:-0.130888 5:-0.794956 6:-0.603582 7:-0.742399 8:-0.115472 9:-0.794091 10:-0.664696 11:-0.676838 12:-0.260133 13:-0.761014 14:-0.847119 15:-0.679642 16:-0.244164 17:-0.759383 18:-0.84797 19:-0.627219 20:-0.38612 21:-0.400004 22:0.0708809 23:-0.669974 24:-0.372517 25:-0.387369 26:-0.173181 27:-0.569459 28:-0.534451 29:-0.291723 30:-0.405851 31:-0.535053 32:-0.789849 33:-0.278984 34:-0.481337 35:-0.513316 36:-0.801387
+1024 1:-0.839999 2:-0.388623 3:-0.613871 4:0.00583807 5:-0.71669 6:-0.353787 7:-0.579628 8:-0.217733 9:-0.653757 10:-0.527336 11:-0.480195 12:-0.335813 13:-0.645542 14:-0.74704 15:-0.481324 16:-0.338115 17:-0.639887 18:-0.751427 19:-0.263159 20:0.00868083 21:-0.0400061 22:0.0455079 23:-0.3223 24:0.249214 25:-0.0298035 26:-0.332605 27:-0.149797 28:-0.0633539 29:0.187258 30:-0.619206 31:-0.147548 32:-0.547544 33:0.187258 34:-0.645045 35:-0.134374 36:-0.550311
+1024 1:-0.630107 2:-0.290045 3:-0.32958 4:0.150342 5:-0.800985 6:-0.43041 7:-0.337598 8:0.418606 9:-0.828679 10:-0.386427 11:-0.285079 12:-0.0384826 13:-0.692292 14:-0.67037 15:-0.290636 16:-0.0516063 17:-0.6856 18:-0.675476 19:0.175812 20:-0.0960019 21:0.110581 22:0.0217487 23:-0.568677 24:-0.238096 25:0.13706 26:0.0639509 27:-0.57778 28:-0.277932 29:0.192354 30:-0.0338286 31:-0.435769 32:-0.501518 33:0.189806 34:-0.0401309 35:-0.437515 36:-0.506898
+-1024 1:-0.725116 2:-0.180726 3:-0.364939 4:0.674893 5:-0.807738 6:0.0666459 7:-0.320613 8:-0.310652 9:-0.598498 10:-0.505341 11:-0.260689 12:-0.408173 13:-0.541145 14:-0.657125 15:-0.269278 16:-0.288098 17:-0.559042 18:-0.635127 19:-0.0776947 20:0.0973514 21:0.148228 22:0.644671 23:-0.588782 24:0.583702 25:0.182353 26:-0.851825 27:-0.0260802 28:-0.434646 29:0.230571 30:-0.708121 31:-0.00239652 32:-0.475861 33:0.235666 34:-0.595918 35:-0.0473996 36:-0.450803
+-1024 1:-0.8874 2:-0.684483 3:-0.748239 4:-0.106149 5:-0.90871 6:-0.788414 7:-0.74523 8:-0.0424377 9:-0.892079 10:-0.779373 11:-0.698177 12:-0.0316358 13:-0.880931 14:-0.894676 15:-0.697947 16:-0.0377478 17:-0.879006 18:-0.89989 19:-0.501078 20:-0.462263 21:-0.350591 22:-0.0183933 23:-0.74248 24:-0.600388 25:-0.392137 26:0.0571633 27:-0.718091 28:-0.543437 29:-0.309555 30:-0.0305098 31:-0.708289 32:-0.784937 33:-0.291723 34:-0.196076 35:-0.675534 36:-0.817731
+1024 1:-0.900026 2:-0.455888 3:-0.708635 4:-0.104325 5:-0.761935 6:-0.527505 7:-0.716921 8:-0.0601481 9:-0.741099 10:-0.537154 11:-0.629582 12:-0.307887 13:-0.704642 14:-0.807066 15:-0.635402 16:-0.252828 17:-0.709989 18:-0.800489 19:-0.604738 20:-0.261805 21:-0.296476 22:-0.197645 23:-0.472442 24:-0.35422 25:-0.323009 26:-0.0374917 27:-0.516844 28:-0.301064 29:-0.159237 30:-0.547529 31:-0.393129 32:-0.732072 33:-0.17962 34:-0.449131 35:-0.421154 36:-0.711791
+1024 1:-0.877465 2:-0.532943 3:-0.705806 4:0.11754 5:-0.874288 6:-0.549802 7:-0.677291 8:-0.211437 9:-0.78553 10:-0.715816 11:-0.644826 12:0.0114779 13:-0.813603 14:-0.802628 15:-0.639978 16:-0.25804 17:-0.763258 18:-0.852093 19:-0.479847 20:-0.346958 21:-0.308241 22:0.159575 23:-0.712296 24:-0.351775 25:-0.301555 26:-0.46907 27:-0.455604 28:-0.64518 29:-0.281532 30:0.0169687 31:-0.61467 32:-0.672238 33:-0.240767 34:-0.449284 35:-0.51056 36:-0.783691
+1024 1:-0.626174 2:-0.0911177 3:-0.243303 4:0.869623 5:-0.829507 6:0.171001 7:-0.186151 8:-0.435673 9:-0.523181 10:-0.482469 11:-0.184472 12:-0.470558 13:-0.442164 14:-0.580759 15:-0.186901 16:-0.34777 17:-0.465572 18:-0.559927 19:0.105256 20:0.124807 21:0.22352 22:0.673235 23:-0.58417 24:0.62907 25:0.2634 26:-0.856742 27:-0.0242302 28:-0.424106 29:0.337575 30:-0.664042 31:-0.000163618 32:-0.440779 33:0.340123 34:-0.53477 35:-0.049146 36:-0.410297
+1024 1:-0.789492 2:-0.4949 3:-0.578511 4:0.113794 5:-0.867203 6:-0.566603 7:-0.565476 8:0.121762 9:-0.844627 10:-0.602338 11:-0.498488 12:0.0267093 13:-0.81192 14:-0.791797 15:-0.499631 16:0.0182331 17:-0.809049 18:-0.797718 19:-0.230687 20:-0.190012 21:-0.117654 22:-0.0668918 23:-0.560517 24:-0.330859 25:-0.110853 26:-0.144694 27:-0.490284 28:-0.36146 29:-0.0980927 30:-0.0737511 31:-0.465781 32:-0.563238 33:-0.0955449 34:-0.215585 35:-0.427352 36:-0.6001
+-1024 1:-0.886779 2:-0.50928 3:-0.695906 4:-0.122314 5:-0.797042 6:-0.605406 7:-0.687199 8:0.00887885 9:-0.815467 10:-0.613632 11:-0.612814 12:-0.194959 13:-0.773771 14:-0.831515 15:-0.617095 16:-0.0834416 17:-0.790715 18:-0.817489 19:-0.465484 20:-0.280297 21:-0.26824 22:0.0128756 23:-0.581041 24:-0.277299 25:-0.208586 26:-0.0409192 27:-0.567564 28:-0.381314 29:-0.0726148 30:-0.387992 31:-0.506272 32:-0.743905 33:-0.100641 34:-0.243123 35:-0.527265 36:-0.701494
+1024 1:-0.628451 2:-0.259109 3:-0.295634 4:0.42416 5:-0.81491 6:-0.229814 7:-0.29089 8:0.159132 9:-0.760429 10:-0.439032 11:-0.167704 12:0.0241809 13:-0.705986 14:-0.660061 15:-0.164018 16:-0.00390465 17:-0.701937 18:-0.673114 19:0.177061 20:0.0623525 21:0.242344 22:0.378989 23:-0.558553 24:0.239562 25:0.253864 26:-0.248356 27:-0.348994 28:-0.230373 29:0.335028 30:-0.0434226 31:-0.304871 32:-0.368611 33:0.347767 34:-0.25736 35:-0.248371 36:-0.433886
+-1024 1:-0.900854 2:-0.51846 3:-0.721365 4:-0.0536737 5:-0.820237 6:-0.585379 7:-0.704183 8:-0.0600939 9:-0.809582 10:-0.651874 11:-0.64635 12:-0.205568 13:-0.771216 14:-0.835262 15:-0.647606 16:-0.184689 17:-0.772352 18:-0.83658 19:-0.534174 20:-0.377995 21:-0.345886 22:0.144087 23:-0.710472 24:-0.365052 25:-0.334928 26:-0.132147 27:-0.601829 28:-0.53625 29:-0.235671 30:-0.337211 31:-0.567282 32:-0.782593 33:-0.235671 34:-0.437151 35:-0.535698 36:-0.796661
+-1024 1:-0.541937 2:-0.0781323 3:-0.181069 4:0.0969484 5:-0.669665 6:-0.231169 7:-0.186151 8:0.348286 9:-0.692073 10:-0.167615 11:-0.137216 12:-0.0886255 13:-0.532774 14:-0.501409 15:-0.141135 16:-0.0352273 17:-0.537448 18:-0.490066 19:0.223889 20:0.144426 21:0.256462 22:-0.115976 23:-0.339797 24:0.000988866 25:0.230026 26:0.0428746 27:-0.353498 28:0.100602 29:0.340123 30:-0.0839895 31:-0.199357 32:-0.284341 33:0.350314 34:-0.0379946 35:-0.218463 36:-0.276311
+1024 1:-0.534486 2:-0.154991 3:-0.183898 4:0.25229 5:-0.745527 6:-0.243761 7:-0.194643 8:0.589061 9:-0.785129 10:-0.170171 11:-0.112826 12:-0.0754446 13:-0.616261 14:-0.592849 15:-0.109099 16:-0.0350564 17:-0.623547 18:-0.591873 19:0.444922 20:0.163405 21:0.399989 22:-0.0534863 23:-0.301581 24:0.153992 25:0.373055 26:0.206883 27:-0.389595 28:0.245725 29:0.523562 30:-0.317733 31:-0.127702 32:-0.346177 33:0.538849 34:-0.333583 35:-0.124197 36:-0.352191
+1024 1:-0.652874 2:-0.183958 3:-0.322508 4:0.210018 5:-0.703885 6:-0.177094 7:-0.30929 8:0.0758545 9:-0.672232 10:-0.339348 11:-0.20886 12:-0.424872 13:-0.551454 14:-0.670532 15:-0.218936 16:-0.281528 17:-0.571518 18:-0.642554 19:0.0821544 20:-0.0406568 21:0.0988159 22:0.147479 23:-0.473202 24:0.104689 25:0.160898 26:-0.211603 27:-0.377688 28:-0.240769 29:0.294263 30:-0.497328 31:-0.250816 32:-0.562776 33:0.253501 34:-0.396803 35:-0.262728 36:-0.525377
+-1024 1:-0.648528 2:-0.243572 3:-0.338066 4:0.149905 5:-0.767905 6:-0.361363 7:-0.339013 8:0.401268 9:-0.782125 10:-0.293694 11:-0.278981 12:-0.0752361 13:-0.654015 14:-0.640631 15:-0.281482 16:-0.0689422 17:-0.654253 18:-0.646229 19:0.0665449 20:0.0296368 21:0.171757 22:-0.114165 23:-0.431989 24:-0.156171 25:0.0536273 26:0.186936 27:-0.463326 28:0.0812608 29:0.233118 30:-0.192769 31:-0.294419 32:-0.446614 33:0.215284 34:-0.156372 35:-0.29746 36:-0.430449
+1024 1:-0.781419 2:-0.24759 3:-0.448389 4:0.486037 5:-0.789149 6:-0.0588135 7:-0.409782 8:-0.255584 9:-0.643316 10:-0.5337 11:-0.323187 12:-0.359498 13:-0.601796 14:-0.70166 15:-0.334875 16:-0.235689 17:-0.619045 18:-0.679331 19:-0.180731 20:0.0683651 21:0.09411 22:0.556638 23:-0.552414 24:0.54473 25:0.151363 26:-0.796918 27:-0.0501986 28:-0.414333 29:0.210188 30:-0.78033 31:-0.0164805 32:-0.526683 33:0.228023 34:-0.693069 35:-0.0591003 36:-0.512706
+-13.17770775461184 1:-0.990479 2:-0.696384 3:-0.933523 4:-0.477432 5:-0.744398 6:-0.846843 7:-0.940553 8:-0.445036 9:-0.700477 10:-0.825229 11:-0.911588 12:-0.528373 13:-0.703227 14:-0.941205 15:-0.916097 16:-0.494293 17:-0.70484 18:-0.940703 19:-0.963784 20:-0.557382 21:-0.774119 22:-0.25033 23:-0.544781 24:-0.618734 25:-0.809298 26:-0.147941 27:-0.577615 28:-0.599818 29:-0.696817 30:-0.576649 31:-0.524825 32:-0.901628 33:-0.709554 34:-0.566001 35:-0.51752 36:-0.895543
+-1024 1:-0.998137 2:-0.750941 3:-0.971712 4:-0.441662 5:-0.779606 6:-0.870879 7:-0.96603 8:-0.49684 9:-0.699605 10:-0.875355 11:-0.958842 12:-0.493496 13:-0.711642 14:-0.953381 15:-0.960336 16:-0.450205 17:-0.724134 18:-0.954058 19:-0.981268 20:-0.628803 21:-0.823531 22:-0.202049 23:-0.650134 24:-0.712501 25:-0.852207 26:-0.185183 27:-0.605856 28:-0.692679 29:-0.76051 30:-0.465428 31:-0.60417 32:-0.917322 33:-0.757962 34:-0.450235 35:-0.631941 36:-0.924536
+-1024 1:-0.896093 2:-0.431611 3:-0.691662 4:-0.173135 5:-0.690923 6:-0.474947 7:-0.678707 8:-0.213258 9:-0.644915 10:-0.515005 11:-0.585375 12:-0.453293 13:-0.632625 14:-0.790945 15:-0.58811 16:-0.42459 17:-0.633643 18:-0.790589 19:-0.433636 20:-0.116861 21:-0.171769 22:-0.187957 23:-0.272419 24:-0.0135128 25:-0.220505 26:0.164157 27:-0.479105 28:0.0252788 29:0.0777053 30:-0.642494 31:-0.285635 32:-0.67312 33:0.0853486 34:-0.641853 35:-0.292427 36:-0.676852
+1024 1:-0.679162 2:-0.331144 3:-0.387569 4:0.0789903 5:-0.762146 6:-0.404554 7:-0.389967 8:0.266687 9:-0.773039 10:-0.373784 11:-0.230201 12:-0.119931 13:-0.707925 14:-0.71827 15:-0.23114 16:-0.122693 17:-0.705539 18:-0.723915 19:0.250114 20:0.0690691 21:0.228226 22:0.0693471 23:-0.381544 24:0.172996 25:0.253864 26:0.200635 27:-0.418086 28:0.183821 29:0.431844 30:-0.40163 31:-0.214179 32:-0.475388 33:0.442035 34:-0.50861 35:-0.186635 36:-0.506333
+1024 1:-0.876223 2:-0.534948 3:-0.681763 4:-0.141172 5:-0.797958 6:-0.623485 7:-0.681538 8:-0.115208 9:-0.775521 10:-0.636582 11:-0.589949 12:-0.253773 13:-0.756407 14:-0.832687 15:-0.592687 16:-0.227478 17:-0.757984 18:-0.832919 19:-0.358077 20:-0.223737 21:-0.169416 22:-0.0108783 23:-0.532245 24:-0.216476 25:-0.179981 26:0.00672295 27:-0.523645 28:-0.253476 29:0.0675141 30:-0.391995 31:-0.466311 32:-0.705019 33:0.0751575 34:-0.407331 35:-0.466365 36:-0.710589
+-1024 1:-0.730498 2:-0.483893 3:-0.507792 4:0.0917063 5:-0.866324 6:-0.592596 7:-0.504613 8:0.112907 9:-0.853699 10:-0.632182 11:-0.463427 12:0.10059 13:-0.81883 14:-0.773813 15:-0.463018 16:0.0361757 17:-0.805595 18:-0.786609 19:-0.0683277 20:-0.2618 21:-0.0705945 22:0.152422 23:-0.707348 24:-0.357473 25:-0.072712 26:0.0131003 27:-0.64508 28:-0.451118 29:-0.019111 30:0.136986 31:-0.61313 32:-0.605336 33:-0.0267544 34:0.0954984 35:-0.599663 36:-0.611101
+-1024 1:-0.977439 2:-0.554763 3:-0.878362 4:-0.626796 5:-0.489133 6:-0.697592 7:-0.895261 8:-0.349335 9:-0.60362 10:-0.616367 11:-0.850613 12:-0.738563 13:-0.509053 14:-0.879369 15:-0.852025 16:-0.741123 17:-0.499303 18:-0.885278 19:-0.879487 20:-0.410555 21:-0.595296 22:-0.445943 23:-0.275879 24:-0.460998 25:-0.656737 26:-0.22383 27:-0.373318 28:-0.369723 29:-0.513376 30:-0.974706 31:-0.217464 32:-0.851967 33:-0.510828 34:-1 35:-0.217768 36:-0.859282
+-1024 1:-0.689719 2:-0.317143 3:-0.430001 4:0.253038 5:-0.802153 6:-0.329264 7:-0.452245 8:-0.0227167 9:-0.715423 10:-0.480222 11:-0.356723 12:-0.314516 13:-0.645263 14:-0.729661 15:-0.369962 16:-0.275718 17:-0.640775 18:-0.719713 19:0.102759 20:-0.118295 21:0.039992 22:0.165082 23:-0.556943 24:-0.0386361 25:0.0417082 26:-0.0643554 27:-0.488823 28:-0.264568 29:0.240762 30:-0.394718 31:-0.369876 32:-0.616782 33:0.225475 34:-0.325618 35:-0.38053 36:-0.594416
+1024 1:-0.818472 2:-0.204189 3:-0.516279 4:0.188217 5:-0.689039 6:-0.130365 7:-0.514521 8:0.204065 9:-0.690182 10:-0.242626 11:-0.405503 12:-0.170726 13:-0.598649 14:-0.62457 15:-0.4081 16:-0.146386 17:-0.599904 18:-0.62412 19:-0.333723 20:0.107387 21:0.0352861 22:-0.0608201 23:-0.211072 24:0.29152 25:0.032173 26:-0.168192 27:-0.226909 28:0.0343842 29:0.136302 30:-0.607598 31:0.0192941 32:-0.411325 33:0.126111 34:-0.625842 35:0.03475 36:-0.410768
+-1024 1:-0.593266 2:-0.211112 3:-0.241888 4:0.905898 5:-0.883187 6:0.026249 7:-0.214459 8:-0.458847 9:-0.605342 10:-0.602543 11:-0.126546 12:-0.68571 13:-0.540455 14:-0.749995 15:-0.136558 16:-0.427495 17:-0.582014 18:-0.706108 19:0.314426 20:-0.0408532 21:0.315285 22:0.928483 23:-0.775019 24:0.432064 25:0.306308 26:-0.585397 27:-0.323385 28:-0.511573 29:0.510823 30:-0.8464 31:-0.224808 32:-0.684195 33:0.492988 34:-0.712328 35:-0.255295 36:-0.651922
+-1024 1:-0.527656 2:0.0855326 3:-0.134395 4:0.494448 5:-0.641688 6:0.220568 7:-0.128121 8:0.221748 9:-0.542116 10:0.00423714 11:-0.0488043 12:-0.228941 13:-0.37598 14:-0.387799 15:-0.0480787 16:-0.214269 17:-0.372953 18:-0.387181 19:0.372493 20:0.35218 21:0.399989 22:0.012572 23:-0.176589 24:0.474721 25:0.437415 26:-0.416101 27:-0.00950523 28:0.0890462 29:0.515919 30:-0.245822 31:0.0898294 32:-0.0917271 33:0.531205 34:-0.230604 35:0.0810641 36:-0.0912301
+-1024 1:-0.979094 2:-0.733505 3:-0.910892 4:-0.402267 5:-0.800128 6:-0.841493 7:-0.896677 8:-0.509913 9:-0.708834 10:-0.85455 11:-0.879576 12:-0.498145 13:-0.736571 14:-0.940477 15:-0.882535 16:-0.474462 17:-0.737493 18:-0.941924 19:-0.852636 20:-0.556573 21:-0.616473 22:-0.235039 23:-0.604402 24:-0.635791 25:-0.637666 26:-0.207799 27:-0.583042 28:-0.627677 29:-0.492996 30:-0.578365 31:-0.566856 32:-0.892251 33:-0.490448 34:-0.607825 35:-0.568481 36:-0.900553
+286.3450987087393 1:-0.775416 2:-0.353881 3:-0.51345 4:0.34444 5:-0.84437 6:-0.324884 7:-0.496121 8:-0.07578 9:-0.74828 10:-0.570208 11:-0.469525 12:-0.25353 13:-0.662485 14:-0.731114 15:-0.479799 16:0.000640807 17:-0.706657 18:-0.683405 19:-0.166368 20:-0.144192 21:-0.0682416 22:0.350469 23:-0.661044 24:-0.0110984 25:-0.00834928 26:-0.391687 27:-0.423868 28:-0.498777 29:0.0242015 30:-0.324346 31:-0.383084 32:-0.607514 33:-0.00637204 34:-0.157138 35:-0.419845 36:-0.558271
+-1024 1:-0.697792 2:-0.268329 3:-0.401713 4:0.795801 5:-0.889393 6:-0.0469431 7:-0.363074 8:-0.495879 9:-0.610372 10:-0.640027 11:-0.330808 12:-0.401352 13:-0.592444 14:-0.709252 15:-0.3364 16:-0.456656 17:-0.576206 18:-0.724846 19:0.00160853 20:-0.0593679 21:0.0894041 22:0.72708 23:-0.731651 24:0.317135 25:0.108455 26:-0.768391 27:-0.241352 28:-0.595703 29:0.202545 30:-0.509838 31:-0.262459 32:-0.584392 33:0.192354 34:-0.571386 35:-0.244016 36:-0.599751
+-1024 1:-0.635281 2:-0.427117 3:-0.415857 4:0.144188 5:-0.848622 6:-0.52654 7:-0.377228 8:0.0906808 9:-0.834677 10:-0.618597 11:-0.298798 12:-0.0869312 13:-0.781278 14:-0.790434 15:-0.304365 16:-0.103571 17:-0.776968 18:-0.797575 19:0.143344 20:-0.222669 21:-0.00471179 22:0.268033 23:-0.707486 24:-0.228102 25:0.0464758 26:-0.0249336 27:-0.621436 28:-0.443891 29:0.212736 30:-0.158937 31:-0.555941 32:-0.677548 33:0.20764 34:-0.200195 35:-0.546511 36:-0.686247
+1024 1:-0.988202 2:-0.691977 3:-0.933523 4:-0.563082 5:-0.66376 6:-0.838195 7:-0.926399 8:-0.523791 9:-0.670004 10:-0.844416 11:-0.907015 12:-0.618641 13:-0.66323 14:-0.945114 15:-0.911521 16:-0.569079 17:-0.664677 18:-0.940587 19:-0.930065 20:-0.552423 21:-0.734118 22:-0.230181 23:-0.552615 24:-0.591252 25:-0.733016 26:-0.262094 27:-0.524792 28:-0.636085 29:-0.630575 30:-0.557348 31:-0.551078 32:-0.896821 33:-0.633122 34:-0.598296 35:-0.509431 36:-0.88777
+1024 1:-0.563462 2:-0.183547 3:-0.209357 4:0.510709 5:-0.810906 6:-0.156161 7:-0.200305 8:0.176484 9:-0.736201 10:-0.387142 11:-0.115875 12:0.0302109 13:-0.657778 14:-0.597902 15:-0.116726 16:-0.0121668 17:-0.650619 18:-0.614456 19:0.26947 20:0.0744746 21:0.261167 22:0.376135 23:-0.555583 24:0.240867 25:0.315843 26:-0.30169 27:-0.334232 28:-0.26006 29:0.383436 30:-0.0420129 31:-0.29492 32:-0.356196 33:0.411462 34:-0.240253 35:-0.246383 36:-0.41995
+1024 1:-0.985511 2:-0.702939 3:-0.922208 4:-0.479311 5:-0.717316 6:-0.816761 7:-0.927814 8:-0.440873 9:-0.691599 10:-0.80703 11:-0.902442 12:-0.540781 13:-0.68574 14:-0.932435 15:-0.905418 16:-0.523607 17:-0.681513 18:-0.93339 19:-0.873867 20:-0.546408 21:-0.621179 22:-0.368625 23:-0.485346 24:-0.628939 25:-0.685343 26:-0.182874 27:-0.562201 28:-0.585029 29:-0.518472 30:-0.727116 31:-0.490273 32:-0.901357 33:-0.523567 34:-0.735247 35:-0.482279 36:-0.899602
+-1024 1:-0.94163 2:-0.60033 3:-0.796328 4:-0.342433 5:-0.654695 6:-0.58631 7:-0.743814 8:-0.705336 9:-0.58315 10:-0.799592 11:-0.695129 12:-0.848642 13:-0.575499 14:-0.900248 15:-0.696422 16:-0.798368 17:-0.582238 18:-0.898527 19:-0.68654 20:-0.415012 21:-0.411768 22:-0.0440326 23:-0.527903 24:-0.262391 25:-0.353996 26:-0.45659 27:-0.472448 28:-0.662957 29:-0.194906 30:-0.888602 31:-0.414149 32:-0.871434 33:-0.210193 34:-0.839251 35:-0.421402 36:-0.861061
+-1024 1:-0.941423 2:-0.681637 3:-0.824615 4:-0.238522 5:-0.858821 6:-0.790307 7:-0.823076 8:-0.219196 9:-0.823509 10:-0.781201 11:-0.778969 12:-0.248377 13:-0.8172 14:-0.904194 15:-0.781852 16:-0.218293 17:-0.820303 18:-0.904643 19:-0.690287 20:-0.46841 21:-0.451769 22:-0.14071 23:-0.653562 24:-0.583973 25:-0.489873 26:-0.0693675 27:-0.625554 28:-0.524454 29:-0.335033 30:-0.339337 31:-0.619454 32:-0.830065 33:-0.332485 34:-0.424994 35:-0.59948 36:-0.844859
+1024 1:-0.67171 2:-0.224806 3:-0.363524 4:0.151875 5:-0.745488 6:-0.310617 7:-0.350337 8:0.366416 9:-0.760132 10:-0.272286 11:-0.288128 12:-0.0850371 13:-0.635826 14:-0.624003 15:-0.295212 16:-0.0287167 17:-0.640375 18:-0.61295 19:0.0971395 20:0.101767 21:0.192934 22:-0.0477076 23:-0.388642 24:0.00211326 25:0.194272 26:0.0901628 27:-0.387715 28:0.0984554 29:0.248405 30:-0.136253 31:-0.210451 32:-0.330997 33:0.256048 34:-0.0723775 35:-0.236133 36:-0.318552
+-1024 1:-0.611479 2:-0.200716 3:-0.280076 4:0.138837 5:-0.750604 6:-0.342361 7:-0.292305 8:0.371243 9:-0.75779 10:-0.272085 11:-0.240872 12:-0.105595 13:-0.618913 14:-0.611977 15:-0.240293 16:-0.0919942 17:-0.621184 18:-0.616919 19:0.138973 20:0.0481336 21:0.209405 22:-0.126799 23:-0.419572 24:-0.148817 25:0.0965357 26:0.181666 27:-0.452309 28:0.0950483 29:0.286622 30:-0.194002 31:-0.277509 32:-0.42872 33:0.271335 34:-0.140668 35:-0.288873 36:-0.410967
+1024 1:-0.823647 2:-0.364782 3:-0.578511 4:0.0365183 5:-0.767728 6:-0.427376 7:-0.568306 8:0.0980601 9:-0.739463 10:-0.419546 11:-0.469525 12:-0.148517 13:-0.703396 14:-0.732875 15:-0.46912 16:-0.173026 17:-0.692776 18:-0.739109 19:-0.243176 20:-0.0265221 21:0.0235237 22:-0.155442 23:-0.38712 24:-0.139887 25:-0.0155007 26:-0.00737473 27:-0.369825 28:0.000981266 29:0.136302 30:-0.411288 31:-0.262806 32:-0.540146 33:0.0904442 34:-0.288011 35:-0.279073 36:-0.492392
+1024 1:-0.97599 2:-0.741459 3:-0.906651 4:-0.388098 5:-0.812363 6:-0.844049 7:-0.89243 8:-0.486712 9:-0.730405 10:-0.858265 11:-0.876527 12:-0.457247 13:-0.754349 14:-0.938543 15:-0.879484 16:-0.432673 17:-0.75619 18:-0.940129 19:-0.843271 20:-0.548785 21:-0.618826 22:-0.212265 23:-0.608878 24:-0.618212 25:-0.637666 26:-0.215632 27:-0.569712 28:-0.618253 29:-0.498092 30:-0.546997 31:-0.563153 32:-0.881011 33:-0.480257 34:-0.591129 35:-0.565416 36:-0.892753
+999.3938526293658 1:-0.873946 2:-0.415795 3:-0.647817 4:0.202283 5:-0.798589 6:-0.311723 7:-0.58529 8:-0.429296 9:-0.668436 10:-0.690774 11:-0.533548 12:-0.403559 13:-0.668562 14:-0.798277 15:-0.536244 16:-0.401957 17:-0.663829 18:-0.802291 19:-0.436135 20:-0.152219 21:-0.169416 22:0.409758 23:-0.611865 24:0.206116 25:-0.0703282 26:-0.702398 27:-0.245097 28:-0.57437 29:0.0420361 30:-0.74317 31:-0.269806 32:-0.708259 33:0.0318449 34:-0.811881 35:-0.244801 36:-0.720448
+656.6192720771991 1:-0.461631 2:-0.0571855 3:-0.138637 4:0.27718 5:-0.654833 6:-0.036141 7:-0.152181 8:0.239289 9:-0.593992 10:-0.0739274 11:0.0426565 12:-0.630692 13:-0.429698 14:-0.621105 15:0.0510795 16:-0.412269 17:-0.473837 18:-0.584357 19:0.601643 20:0.128185 21:0.418813 22:0.0237406 23:-0.32958 24:0.208289 25:0.315843 26:0.132512 27:-0.274496 28:0.365883 29:0.747766 30:-0.933253 31:-0.0158228 32:-0.558968 33:0.737574 34:-0.704049 35:-0.080675 36:-0.500656
+-1024 1:-0.998137 2:-0.750941 3:-0.971712 4:-0.441662 5:-0.779606 6:-0.870879 7:-0.96603 8:-0.49684 9:-0.699605 10:-0.875355 11:-0.958842 12:-0.493496 13:-0.711642 14:-0.953381 15:-0.960336 16:-0.450205 17:-0.724134 18:-0.954058 19:-0.981268 20:-0.628803 21:-0.823531 22:-0.202049 23:-0.650134 24:-0.712501 25:-0.852207 26:-0.185183 27:-0.605856 28:-0.692679 29:-0.76051 30:-0.465428 31:-0.60417 32:-0.917322 33:-0.757962 34:-0.450235 35:-0.631941 36:-0.924536
+-1024 1:-0.985511 2:-0.702939 3:-0.922208 4:-0.479311 5:-0.717316 6:-0.816761 7:-0.927814 8:-0.440873 9:-0.691599 10:-0.80703 11:-0.902442 12:-0.540781 13:-0.68574 14:-0.932435 15:-0.905418 16:-0.523607 17:-0.681513 18:-0.93339 19:-0.873867 20:-0.546408 21:-0.621179 22:-0.368625 23:-0.485346 24:-0.628939 25:-0.685343 26:-0.182874 27:-0.562201 28:-0.585029 29:-0.518472 30:-0.727116 31:-0.490273 32:-0.901357 33:-0.523567 34:-0.735247 35:-0.482279 36:-0.899602
+-242.6418124185815 1:-0.975162 2:-0.555636 3:-0.872705 4:-0.610327 5:-0.50824 6:-0.697117 7:-0.891015 8:-0.322716 9:-0.623584 10:-0.613881 11:-0.842991 12:-0.715947 13:-0.525614 14:-0.877214 15:-0.845923 16:-0.715499 17:-0.516708 18:-0.882927 19:-0.873867 20:-0.400413 21:-0.592943 22:-0.419909 23:-0.289717 24:-0.443976 25:-0.651969 26:-0.211968 27:-0.374217 28:-0.353878 29:-0.505733 30:-0.955856 31:-0.221327 32:-0.845256 33:-0.503187 34:-0.990374 35:-0.214629 36:-0.852756
+-1024 1:-0.889884 2:-0.382324 3:-0.646403 4:-0.381426 5:-0.581891 6:-0.506179 7:-0.668799 8:-0.0204759 9:-0.654262 10:-0.36151 11:-0.545743 12:-0.618363 13:-0.560215 14:-0.785846 15:-0.55455 16:-0.504503 17:-0.580078 18:-0.770585 19:-0.474226 20:-0.109953 21:-0.131771 22:-0.275985 23:-0.268535 24:-0.122067 25:-0.218122 26:0.0997486 27:-0.361784 28:0.155289 29:0.126111 30:-0.978347 31:-0.155892 32:-0.722012 33:0.0828008 34:-0.753507 35:-0.204932 36:-0.663443
+1024 1:-0.558495 2:-0.113027 3:-0.217843 4:0.203335 5:-0.656919 6:-0.103018 7:-0.224367 8:0.253425 9:-0.628016 10:-0.118777 11:-0.0350849 12:-0.569376 13:-0.483049 14:-0.651196 15:-0.0389255 16:-0.423684 17:-0.50361 18:-0.623166 19:0.428064 20:0.104032 21:0.362344 22:0.0381687 23:-0.351805 24:0.186329 25:0.258632 26:0.141822 27:-0.295969 28:0.338592 29:0.666239 30:-0.954551 31:-0.029387 32:-0.583483 33:0.648404 34:-0.683902 35:-0.105781 36:-0.515548
+1024 1:-0.999172 2:-0.716897 3:-0.978784 4:-0.379036 5:-0.768363 6:-0.809331 7:-0.94763 8:-0.617888 9:-0.643731 10:-0.89595 11:-0.952744 12:-0.566126 13:-0.66442 14:-0.949878 15:-0.95576 16:-0.537628 17:-0.664424 18:-0.949819 19:-0.977521 20:-0.601904 21:-0.816472 22:-0.144113 23:-0.595488 24:-0.565627 25:-0.809298 26:-0.32094 27:-0.55117 28:-0.752153 29:-0.737579 30:-0.600396 31:-0.532605 32:-0.920987 33:-0.737579 34:-0.606623 35:-0.530906 36:-0.92072
+153.2749903514445 1:-0.959845 2:-0.563448 3:-0.857146 4:-0.167255 5:-0.790895 6:-0.619845 7:-0.857045 8:-0.166735 9:-0.783092 10:-0.68652 11:-0.807931 12:-0.257683 13:-0.752861 14:-0.857023 15:-0.813886 16:-0.226034 17:-0.756817 18:-0.857687 19:-0.878862 20:-0.319548 21:-0.661179 22:-0.134271 23:-0.488388 24:-0.35282 25:-0.694878 26:-0.106952 27:-0.535612 28:-0.461041 29:-0.622931 30:-0.326876 31:-0.451101 32:-0.734926 33:-0.628027 34:-0.360574 35:-0.446916 36:-0.74849
+1024 1:-0.981992 2:-0.602435 3:-0.91655 4:-0.370346 5:-0.703082 6:-0.70116 7:-0.909414 8:-0.416189 9:-0.642766 10:-0.728027 11:-0.878052 12:-0.497467 13:-0.653581 14:-0.890149 15:-0.88101 16:-0.4821 17:-0.650508 18:-0.892385 19:-0.883858 20:-0.440998 21:-0.661179 22:-0.206992 23:-0.482275 24:-0.444066 25:-0.692494 26:-0.14543 27:-0.518505 28:-0.483167 29:-0.561784 30:-0.523355 31:-0.490748 32:-0.839165 33:-0.554141 34:-0.564235 35:-0.483862 36:-0.848053
+-1024 1:-0.903338 2:-0.532988 3:-0.724194 4:-0.250982 5:-0.741402 6:-0.622789 7:-0.722583 8:-0.163695 9:-0.732838 10:-0.606502 11:-0.63568 12:-0.362113 13:-0.70938 14:-0.831234 15:-0.638453 16:-0.347206 17:-0.706468 18:-0.831965 19:-0.577886 20:-0.168882 21:-0.301182 22:-0.120801 23:-0.400307 24:-0.141809 25:-0.318241 26:-0.0782228 27:-0.358382 28:-0.091721 29:-0.0955449 30:-0.560216 31:-0.31908 32:-0.676992 33:-0.113377 34:-0.540002 35:-0.319948 36:-0.669689
+1024 1:-0.706693 2:-0.350799 3:-0.444145 4:0.133625 5:-0.812838 6:-0.45442 7:-0.439506 8:0.158198 9:-0.794652 10:-0.492497 11:-0.394832 12:-0.0399076 13:-0.717047 14:-0.702531 15:-0.398947 16:-0.115029 17:-0.700094 18:-0.720757 19:-0.059585 20:-0.124736 21:0.00470003 22:-0.113606 23:-0.527782 24:-0.322857 25:-0.00119786 26:-0.168894 27:-0.469909 28:-0.347343 29:0.0445839 30:-0.0458956 31:-0.42873 32:-0.505619 33:0.0471317 34:-0.159693 35:-0.399811 36:-0.538055
+1024 1:-0.958603 2:-0.590284 3:-0.850074 4:-0.278512 5:-0.771314 6:-0.694699 7:-0.844307 8:-0.291492 9:-0.735272 10:-0.717395 11:-0.804882 12:-0.364068 13:-0.720699 14:-0.868767 15:-0.807784 16:-0.339551 17:-0.722828 18:-0.870459 19:-0.808302 20:-0.366355 21:-0.557649 22:-0.082449 23:-0.57049 24:-0.397078 25:-0.57092 26:-0.076554 27:-0.554425 28:-0.431558 29:-0.439492 30:-0.373605 31:-0.513243 32:-0.776018 33:-0.426753 34:-0.467028 35:-0.49801 36:-0.801315
+1024 1:-0.476326 2:-0.00657396 3:-0.0877193 4:0.246646 5:-0.666964 6:-0.0965739 7:-0.108305 8:0.739118 9:-0.730223 10:0.0817469 11:-0.00917062 12:-0.0797021 13:-0.497797 14:-0.452988 15:-0.00841485 16:-0.205665 17:-0.47494 18:-0.49575 19:0.490502 20:0.270139 21:0.472931 22:-0.181725 23:-0.204292 24:0.15223 25:0.38259 26:0.208012 27:-0.277962 28:0.46526 29:0.566875 30:-0.289072 31:0.00734034 32:-0.198372 33:0.607639 34:-0.488609 35:0.0512591 36:-0.279318
+532.9936258716319 1:-0.616654 2:-0.226486 3:-0.247544 4:0.768588 5:-0.852517 6:-0.0106526 7:-0.225782 8:-0.452808 9:-0.609157 10:-0.604354 11:-0.118924 12:-0.631648 13:-0.562611 14:-0.750235 15:-0.128931 16:-0.433254 17:-0.593341 18:-0.718598 19:0.259479 20:-0.0398227 21:0.312932 22:0.884208 23:-0.757694 24:0.42459 25:0.296773 26:-0.58048 27:-0.322193 28:-0.506976 29:0.467513 30:-0.855833 31:-0.215902 32:-0.684529 33:0.4828 34:-0.698333 35:-0.2614 36:-0.651128
+-865.5171239135595 1:-0.901682 2:-0.369461 3:-0.695906 4:0.105183 5:-0.729933 6:-0.252671 7:-0.647568 8:-0.352294 9:-0.636996 10:-0.608017 11:-0.591473 12:-0.429712 13:-0.615103 14:-0.767554 15:-0.595738 16:-0.344429 17:-0.630555 18:-0.757535 19:-0.571018 20:-0.103366 21:-0.280005 22:0.261631 23:-0.461659 24:0.325784 25:-0.184748 26:-0.721929 27:-0.0985898 28:-0.463078 29:-0.0802581 30:-0.926607 31:-0.0904967 32:-0.683267 33:-0.0853537 34:-0.831802 35:-0.122875 36:-0.661466
+970.8737008594042 1:-0.949082 2:-0.620612 3:-0.8133 4:-0.378818 5:-0.713983 6:-0.699318 7:-0.808921 8:-0.417536 9:-0.690979 10:-0.747295 11:-0.743909 12:-0.631031 13:-0.663509 14:-0.901239 15:-0.746764 16:-0.588072 17:-0.667478 18:-0.899544 19:-0.793317 20:-0.410943 21:-0.538826 22:-0.150446 23:-0.525296 24:-0.409817 25:-0.549466 26:-0.165225 27:-0.522756 28:-0.482903 29:-0.383441 30:-0.627197 31:-0.474437 32:-0.841895 33:-0.385989 34:-0.576787 35:-0.490024 36:-0.832787
+-1024 1:-0.985304 2:-0.69591 3:-0.920793 4:-0.443742 5:-0.7577 6:-0.829743 7:-0.927814 8:-0.40862 9:-0.71997 10:-0.811781 11:-0.893296 12:-0.497467 13:-0.717506 14:-0.933854 15:-0.896265 16:-0.465883 17:-0.718329 18:-0.932981 19:-0.944426 20:-0.525641 21:-0.760001 22:-0.206385 23:-0.533156 24:-0.543319 25:-0.775925 26:-0.187476 27:-0.523879 28:-0.565015 29:-0.648409 30:-0.637669 31:-0.480921 32:-0.890613 33:-0.666244 34:-0.620175 35:-0.470209 36:-0.881367
+-1024 1:-0.861734 2:-0.660261 3:-0.701563 4:-0.0651613 5:-0.912114 6:-0.773228 7:-0.694275 8:-0.00165526 9:-0.905463 10:-0.784073 11:-0.653972 12:0.0851066 13:-0.893443 14:-0.87382 15:-0.653708 16:0.0118763 17:-0.881776 18:-0.887102 19:-0.347461 20:-0.496869 21:-0.223534 22:0.195664 23:-0.847088 24:-0.605287 25:-0.230041 26:0.168377 27:-0.82124 28:-0.633579 29:-0.217837 30:0.234794 31:-0.786918 32:-0.757406 33:-0.20255 34:0.0616958 35:-0.755478 36:-0.790797
+-1024 1:-0.896093 2:-0.431611 3:-0.691662 4:-0.173135 5:-0.690923 6:-0.474947 7:-0.678707 8:-0.213258 9:-0.644915 10:-0.515005 11:-0.585375 12:-0.453293 13:-0.632625 14:-0.790945 15:-0.58811 16:-0.42459 17:-0.633643 18:-0.790589 19:-0.433636 20:-0.116861 21:-0.171769 22:-0.187957 23:-0.272419 24:-0.0135128 25:-0.220505 26:0.164157 27:-0.479105 28:0.0252788 29:0.0777053 30:-0.642494 31:-0.285635 32:-0.67312 33:0.0853486 34:-0.641853 35:-0.292427 36:-0.676852
+1024 1:-0.955084 2:-0.481854 3:-0.845831 4:-0.0626558 5:-0.760918 6:-0.455343 7:-0.825906 8:-0.217354 9:-0.72654 10:-0.644435 11:-0.791164 12:-0.253017 13:-0.703978 14:-0.80844 15:-0.794056 16:-0.221377 17:-0.705777 18:-0.806207 19:-0.839524 20:-0.206493 21:-0.61412 22:0.127864 23:-0.503708 24:0.0425313 25:-0.580455 26:-0.333183 27:-0.332324 28:-0.42836 29:-0.503187 30:-0.56016 31:-0.286389 32:-0.711196 33:-0.510828 34:-0.467012 35:-0.319479 36:-0.687625
+-1024 1:-0.923416 2:-0.512037 3:-0.755311 4:-0.267608 5:-0.712989 6:-0.595882 7:-0.746645 8:-0.303726 9:-0.663892 10:-0.619209 11:-0.679885 12:-0.405245 13:-0.671793 14:-0.820575 15:-0.681168 16:-0.439542 17:-0.658514 18:-0.831106 19:-0.566022 20:-0.228174 21:-0.280005 22:-0.246106 23:-0.339685 24:-0.210849 25:-0.277717 26:-0.241304 27:-0.347224 28:-0.273489 29:-0.0980927 30:-0.396949 31:-0.401435 32:-0.66624 33:-0.0777103 34:-0.633944 35:-0.338307 36:-0.720935
+1024 1:-0.940802 2:-0.579741 3:-0.796328 4:-0.284508 5:-0.764271 6:-0.68658 7:-0.787692 8:-0.332221 9:-0.713162 10:-0.709743 11:-0.733238 12:-0.454049 13:-0.706662 14:-0.874327 15:-0.736086 16:-0.420984 17:-0.709449 18:-0.874086 19:-0.606611 20:-0.355366 21:-0.334121 22:-0.138878 23:-0.527925 24:-0.374996 25:-0.344463 26:-0.090528 27:-0.538234 28:-0.397288 29:-0.108281 30:-0.568795 31:-0.490494 32:-0.803774 33:-0.108281 34:-0.566968 35:-0.491249 36:-0.802952
+-100.7448683774241 1:-0.559736 2:-0.0323976 3:-0.203699 4:0.523018 5:-0.706488 6:0.115674 7:-0.200305 8:0.251597 9:-0.622535 10:-0.112383 11:-0.0716699 12:-0.103683 13:-0.517787 14:-0.48935 15:-0.0724873 16:-0.102521 17:-0.513233 18:-0.492367 19:0.347518 20:0.331081 21:0.39058 22:0.12348 23:-0.243096 24:0.518101 25:0.418345 26:-0.369043 27:-0.0460063 28:0.0920248 29:0.487895 30:-0.257495 31:0.0741862 32:-0.116317 33:0.490441 34:-0.211731 35:0.0595924 36:-0.101542
+251.4092755409974 1:-0.903131 2:-0.49307 3:-0.704392 4:-0.300296 5:-0.647237 6:-0.528005 7:-0.692859 8:-0.393307 9:-0.619071 10:-0.621388 11:-0.588424 12:-0.659496 13:-0.601988 14:-0.83869 15:-0.595738 16:-0.607903 17:-0.605228 18:-0.834222 19:-0.491711 20:-0.260555 21:-0.240005 22:-0.122234 23:-0.435028 24:-0.19556 25:-0.256263 26:-0.0901291 27:-0.483798 28:-0.303206 29:-0.0216588 30:-0.656293 31:-0.379099 32:-0.753662 33:-0.0471368 34:-0.556754 35:-0.400927 36:-0.730846
+1024 1:-0.903131 2:-0.49307 3:-0.704392 4:-0.300296 5:-0.647237 6:-0.528005 7:-0.692859 8:-0.393307 9:-0.619071 10:-0.621388 11:-0.588424 12:-0.659496 13:-0.601988 14:-0.83869 15:-0.595738 16:-0.607903 17:-0.605228 18:-0.834222 19:-0.491711 20:-0.260555 21:-0.240005 22:-0.122234 23:-0.435028 24:-0.19556 25:-0.256263 26:-0.0901291 27:-0.483798 28:-0.303206 29:-0.0216588 30:-0.656293 31:-0.379099 32:-0.753662 33:-0.0471368 34:-0.556754 35:-0.400927 36:-0.730846
+-1024 1:-0.906857 2:-0.51699 3:-0.728437 4:-0.24532 5:-0.727479 6:-0.597138 7:-0.725414 8:-0.25616 9:-0.687332 10:-0.611968 11:-0.650923 12:-0.349775 13:-0.692144 14:-0.81367 15:-0.649132 16:-0.383706 17:-0.681844 18:-0.825234 19:-0.517314 20:-0.19246 21:-0.247064 22:-0.196058 23:-0.357392 24:-0.168107 25:-0.256263 26:-0.217812 27:-0.33297 28:-0.221612 29:-0.0751625 30:-0.339813 31:-0.389808 32:-0.627706 33:-0.0394934 34:-0.601496 35:-0.323599 36:-0.692908
+1024 1:-0.880777 2:-0.347925 3:-0.637916 4:-0.229594 5:-0.658978 6:-0.47899 7:-0.651814 8:0.0567901 9:-0.667443 10:-0.3139 11:-0.548792 12:-0.346995 13:-0.621271 14:-0.734947 15:-0.551499 16:-0.345625 17:-0.6158 18:-0.73888 19:-0.330601 20:-0.0299582 21:-0.047065 22:-0.283106 23:-0.221082 24:-0.0489163 25:-0.106085 26:0.0314009 27:-0.273221 28:0.21938 29:0.187258 30:-0.677084 31:-0.16459 32:-0.589285 33:0.187258 34:-0.722091 35:-0.152504 36:-0.601631
+-1024 1:-0.795495 2:-0.353739 3:-0.509207 4:0.464029 5:-0.843604 6:-0.205275 7:-0.456491 8:-0.366477 9:-0.684426 10:-0.661957 11:-0.402454 12:-0.301083 13:-0.676423 14:-0.758405 15:-0.403524 16:-0.332288 17:-0.666378 18:-0.768171 19:-0.152629 20:-0.0835789 21:0.039992 22:0.580845 23:-0.672574 24:0.275738 25:0.0917681 26:-0.774055 27:-0.225245 28:-0.586518 29:0.171971 30:-0.615919 31:-0.260229 32:-0.634624 33:0.154137 34:-0.703686 35:-0.228404 36:-0.651701
+1024 1:-0.937698 2:-0.584791 3:-0.769455 4:-0.406232 5:-0.662642 6:-0.649914 7:-0.779199 8:-0.360066 9:-0.678142 10:-0.685009 11:-0.696653 12:-0.641319 13:-0.637832 14:-0.877763 15:-0.699473 16:-0.613209 17:-0.637225 18:-0.877581 19:-0.712766 20:-0.336463 21:-0.432945 22:-0.155287 23:-0.460441 24:-0.299862 25:-0.477954 26:0.00360449 27:-0.541548 28:-0.303759 29:-0.27134 30:-0.643195 31:-0.403927 32:-0.790367 33:-0.266245 34:-0.644344 35:-0.398087 36:-0.78563
+1024 1:-0.986753 2:-0.750316 3:-0.933523 4:-0.412259 5:-0.790853 6:-0.845266 7:-0.922153 8:-0.481513 9:-0.711106 10:-0.850583 11:-0.911588 12:-0.48718 13:-0.725548 14:-0.941168 15:-0.914572 16:-0.460988 17:-0.724799 18:-0.941 19:-0.893224 20:-0.546183 21:-0.661179 22:-0.260204 23:-0.58677 24:-0.649434 25:-0.673424 26:-0.257408 27:-0.532439 28:-0.622631 29:-0.577071 30:-0.612826 31:-0.514328 32:-0.886918 33:-0.579619 34:-0.659234 35:-0.499409 36:-0.893397
+1024 1:-0.981992 2:-0.602435 3:-0.91655 4:-0.370346 5:-0.703082 6:-0.70116 7:-0.909414 8:-0.416189 9:-0.642766 10:-0.728027 11:-0.878052 12:-0.497467 13:-0.653581 14:-0.890149 15:-0.88101 16:-0.4821 17:-0.650508 18:-0.892385 19:-0.883858 20:-0.440998 21:-0.661179 22:-0.206992 23:-0.482275 24:-0.444066 25:-0.692494 26:-0.14543 27:-0.518505 28:-0.483167 29:-0.561784 30:-0.523355 31:-0.490748 32:-0.839165 33:-0.554141 34:-0.564235 35:-0.483862 36:-0.848053
+-1024 1:-0.684958 2:-0.114062 3:-0.338066 4:0.161964 5:-0.652482 6:-0.118922 7:-0.354583 8:0.422702 9:-0.708836 10:-0.111869 11:-0.30642 12:-0.225604 13:-0.491557 14:-0.527014 15:-0.305891 16:-0.325316 17:-0.460672 18:-0.552475 19:0.0540569 20:0.164375 21:0.185875 22:-0.0729101 23:-0.201134 24:0.297222 25:0.153747 26:-0.106463 27:-0.260522 28:0.0673936 29:0.245857 30:-0.350333 31:-0.0546644 32:-0.312675 33:0.233118 34:-0.372964 35:-0.0302408 36:-0.305929
+-1024 1:-0.953635 2:-0.611957 3:-0.830273 4:-0.339727 5:-0.739793 6:-0.702476 7:-0.811752 8:-0.444799 9:-0.681441 10:-0.755975 11:-0.768299 12:-0.551173 13:-0.675085 14:-0.890152 15:-0.772698 16:-0.502905 17:-0.684033 18:-0.889899 19:-0.730874 20:-0.37398 21:-0.46118 22:-0.166504 23:-0.46204 24:-0.32096 25:-0.446965 26:-0.213367 27:-0.487588 28:-0.468403 29:-0.261149 30:-0.649276 31:-0.435047 32:-0.811266 33:-0.276436 34:-0.580995 35:-0.470492 36:-0.809605
+1024 1:-0.699448 2:-0.267458 3:-0.387569 4:0.23868 5:-0.750068 6:-0.235694 7:-0.391382 8:0.211471 9:-0.729185 10:-0.329851 11:-0.225628 12:-0.0667124 13:-0.680676 14:-0.666792 15:-0.23114 16:-0.0647129 17:-0.676868 18:-0.66921 19:0.234504 20:0.175175 21:0.317638 22:0.21448 23:-0.346211 24:0.453041 25:0.318227 26:0.205096 27:-0.371369 28:0.279945 29:0.487895 30:-0.455021 31:-0.0802433 32:-0.38266 33:0.495536 34:-0.48016 35:-0.0772722 36:-0.394383
+-1024 1:-1 2:-0.741312 3:-0.984442 4:-0.457358 5:-0.727021 6:-0.840493 7:-0.973107 8:-0.566341 9:-0.650172 10:-0.888075 11:-0.963415 12:-0.577725 13:-0.655066 14:-0.954079 15:-0.966438 16:-0.546813 17:-0.655878 18:-0.953731 19:-1 20:-0.637887 21:-0.849413 22:-0.360886 23:-0.491895 24:-0.721295 25:-0.871277 26:-0.300223 27:-0.589817 28:-0.797496 29:-0.808918 30:-0.566773 31:-0.535986 32:-0.927728 33:-0.811466 34:-0.550966 35:-0.553297 36:-0.929388
+1024 1:-0.596371 2:-0.205877 3:-0.264517 4:0.243806 5:-0.754715 6:-0.260657 7:-0.271074 8:0.30202 9:-0.744431 10:-0.302194 11:-0.169228 12:0.0748799 13:-0.661641 14:-0.583905 15:-0.154865 16:0.00158066 17:-0.656479 18:-0.616258 19:0.335031 20:0.107218 21:0.312932 22:0.0497101 23:-0.399743 24:0.111988 25:0.292005 26:-0.0911405 27:-0.326207 28:-0.0154163 29:0.347767 30:0.173259 31:-0.256379 32:-0.181323 33:0.391079 34:-0.217721 35:-0.161647 36:-0.326771
+-1024 1:-0.904166 2:-0.387861 3:-0.676105 4:-0.408251 5:-0.566899 6:-0.514041 7:-0.69569 8:-0.054556 9:-0.639804 10:-0.367026 11:-0.579278 12:-0.651615 13:-0.547964 14:-0.793989 15:-0.586585 16:-0.546557 17:-0.567193 18:-0.781097 19:-0.524807 20:-0.145783 21:-0.174122 22:-0.337239 23:-0.242196 24:-0.168835 25:-0.251495 26:0.0613494 27:-0.363544 28:0.096184 29:0.0573229 30:-0.991614 31:-0.156688 32:-0.735043 33:0.0216537 34:-0.794428 35:-0.199773 36:-0.683993
+1024 1:-0.849521 2:-0.343452 3:-0.586998 4:-0.204928 5:-0.663827 6:-0.464795 7:-0.600859 8:0.12489 9:-0.693722 10:-0.306525 11:-0.486293 12:-0.313309 13:-0.630477 14:-0.724123 15:-0.487427 16:-0.30757 17:-0.627704 18:-0.728232 19:-0.244425 20:0.0286317 21:0.00470003 22:-0.206262 23:-0.233531 24:0.0409903 25:-0.0512578 26:0.0970796 27:-0.27134 28:0.319873 29:0.245857 30:-0.615677 31:-0.145562 32:-0.53944 33:0.261144 34:-0.69618 35:-0.121711 36:-0.559746
+805.9438103198606 1:-0.559115 2:-0.355744 3:-0.287148 4:-0.0128559 5:-0.792355 6:-0.541256 7:-0.282397 8:0.347115 9:-0.847861 10:-0.489189 11:-0.190569 12:-0.242226 13:-0.719966 14:-0.772236 15:-0.189952 16:0.0665926 17:-0.766286 18:-0.724801 19:0.225763 20:-0.177183 21:0.00470003 22:0.100541 23:-0.59971 24:-0.204193 25:0.0989195 26:0.161045 27:-0.640084 28:-0.291879 29:0.30955 30:-0.256182 31:-0.501444 32:-0.666218 33:0.26624 34:-0.0997549 35:-0.522637 36:-0.61823
+1024 1:-0.963778 2:-0.627791 3:-0.88402 4:-0.207087 5:-0.812377 6:-0.691464 7:-0.861292 8:-0.328822 9:-0.746002 10:-0.761629 11:-0.824699 12:-0.328219 13:-0.758333 14:-0.889732 15:-0.829142 16:-0.330972 17:-0.752399 18:-0.894645 19:-0.814547 20:-0.487565 21:-0.588238 22:-0.0771017 23:-0.640775 24:-0.508142 25:-0.580455 26:-0.374201 27:-0.471548 28:-0.640259 29:-0.475162 30:-0.413793 31:-0.570686 32:-0.836724 33:-0.470066 34:-0.578367 35:-0.528178 36:-0.866651
+1024 1:-0.958603 2:-0.590284 3:-0.850074 4:-0.278512 5:-0.771314 6:-0.694699 7:-0.844307 8:-0.291492 9:-0.735272 10:-0.717395 11:-0.804882 12:-0.364068 13:-0.720699 14:-0.868767 15:-0.807784 16:-0.339551 17:-0.722828 18:-0.870459 19:-0.808302 20:-0.366355 21:-0.557649 22:-0.082449 23:-0.57049 24:-0.397078 25:-0.57092 26:-0.076554 27:-0.554425 28:-0.431558 29:-0.439492 30:-0.373605 31:-0.513243 32:-0.776018 33:-0.426753 34:-0.467028 35:-0.49801 36:-0.801315
+-473.3564786810877 1:-0.980336 2:-0.571318 3:-0.91655 4:-0.276834 5:-0.686812 6:-0.577512 7:-0.876861 8:-0.578371 9:-0.575038 10:-0.76988 11:-0.861284 12:-0.601107 13:-0.590922 14:-0.88205 15:-0.864229 16:-0.550923 17:-0.598491 18:-0.878569 19:-0.906961 20:-0.39515 21:-0.694121 22:-0.0879774 23:-0.446349 24:-0.222043 25:-0.67104 26:-0.536417 27:-0.244727 28:-0.59533 29:-0.574523 30:-0.820502 31:-0.28637 32:-0.842368 33:-0.579619 34:-0.779691 35:-0.297914 36:-0.832278
+[Sample data in LIBSVM sparse format, truncated here for brevity: 144 instances, each with 36 scaled attributes in the usual "<label> <index>:<value> ..." layout. Labels are mostly +1024 or -1024, with occasional real-valued targets (e.g., 720.8056859514288), which suggests regression (epsilon-SVR) data with unusually large target values.]
+-1024 1:-0.250105 2:0.742962 3:0.251744 4:-0.766125 5:0.466216 6:0.5464 7:0.241306 8:-0.692283 9:0.455983 10:0.587653 11:0.262171 12:-0.530163 13:0.538138 14:0.456074 15:0.263133 16:-0.427247 17:0.504409 18:0.479847 19:0.754617 20:0.704969 21:0.783519 22:-0.619112 23:0.435404 24:0.53973 25:0.759226 26:-0.458574 27:0.408392 28:0.675315 29:0.864964 30:-0.21961 31:0.526689 32:0.384099 33:0.84713 34:-0.156428 35:0.514673 36:0.418122
+1024 1:0.325072 2:0.910155 3:0.523322 4:-0.883804 5:0.775686 6:0.817795 7:0.530062 8:-0.892804 9:0.77287 10:0.796527 11:0.533521 12:-0.501056 13:0.799463 14:0.787453 15:0.534691 16:-0.481468 17:0.794907 18:0.78079 19:0.849524 20:0.91737 21:0.922348 22:-0.779211 23:0.757745 24:0.765051 25:0.914178 26:-0.90625 27:0.849239 28:0.659376 29:0.933755 30:-0.0657038 31:0.821727 32:0.831244 33:0.933755 34:-0.1083 35:0.837425 36:0.803585
+1024 1:-0.494333 2:0.25806 3:-0.0297287 4:-0.342889 5:-0.28025 6:-0.0159853 7:0.00775714 8:-0.624353 9:-0.201197 10:-0.201153 11:-0.00917062 12:-0.0885647 13:-0.222307 14:-0.128457 15:0.000738359 16:-0.487286 17:-0.125825 18:-0.265979 19:0.409332 20:0.175446 21:0.463519 22:-0.145354 23:-0.340856 24:-0.0294301 25:0.458869 26:-0.642018 27:-0.140979 28:-0.329895 29:0.551588 30:0.0522271 31:-0.240971 32:-0.238927 33:0.566875 34:-0.11192 35:-0.202819 36:-0.298897
+-1024 1:-0.49661 2:0.0973578 3:-0.100449 4:0.186064 5:-0.55292 6:0.0689533 7:-0.102643 8:0.25161 9:-0.552543 10:0.00990379 11:-0.0305118 12:-0.220426 13:-0.36738 14:-0.372992 15:-0.0267213 16:-0.322608 17:-0.339809 18:-0.405728 19:0.38623 20:0.27531 21:0.371756 22:0.0617469 23:-0.234867 24:0.43999 25:0.399274 26:-0.135378 27:-0.195835 28:0.149954 29:0.521014 30:-0.28154 31:-0.0129226 32:-0.215242 33:0.513371 34:-0.346377 35:0.0173051 36:-0.228769
+-1024 1:-0.282186 2:0.844606 3:0.241843 4:-0.837981 5:0.616022 6:0.648529 7:0.237059 8:-0.871607 9:0.634121 10:0.619868 11:0.245401 12:-0.517225 13:0.672721 14:0.614335 15:0.241774 16:-0.468703 17:0.657835 18:0.618604 19:0.704666 20:0.838906 21:0.839992 22:-0.782982 23:0.649313 24:0.600438 25:0.81167 26:-0.841638 27:0.705194 28:0.549126 29:0.844582 30:-0.0526218 31:0.674102 32:0.680477 33:0.839486 34:-0.0559801 35:0.688576 36:0.687958
+1024 1:-0.317371 2:0.703 3:0.202237 4:-0.817633 5:0.406543 6:0.408195 7:0.194595 8:-0.770978 9:0.395383 10:0.424716 11:0.193571 12:-0.460714 13:0.442257 14:0.39657 15:0.199058 16:-0.40006 17:0.420904 18:0.402266 19:0.732139 20:0.641494 21:0.75293 22:-0.67283 23:0.387541 24:0.391656 25:0.766375 26:-0.617143 27:0.358044 28:0.383023 29:0.801269 30:-0.0237191 31:0.375458 32:0.380077 33:0.775791 34:-0.00333753 35:0.381755 36:0.399041
+1024 1:1 2:0.984318 3:0.942007 4:-0.974094 5:0.96674 6:0.973213 7:0.950458 8:-0.978627 9:0.961967 10:0.966013 11:0.946645 12:-0.527765 13:0.964057 14:0.970434 15:0.945079 16:-0.515619 17:0.96422 18:0.962858 19:0.973776 20:0.979748 21:0.985882 22:-0.947054 23:0.949299 24:0.770242 25:0.983313 26:-0.90412 27:0.935247 28:0.784136 29:1 30:-0.0626105 31:0.926613 32:0.949899 33:0.994904 34:-0.0520541 35:0.923939 36:0.95051
+-1024 1:1 2:0.995849 3:0.964637 4:-0.9894 5:0.997105 6:0.993146 7:0.961781 8:-0.990204 9:0.996743 10:0.99685 11:0.963414 12:-0.526453 13:0.988553 14:1 15:0.966437 16:-0.525137 17:0.991524 18:0.986792 19:0.876372 20:0.996918 21:0.964703 22:-0.940514 23:0.95108 24:0.779794 25:0.964241 26:-0.983284 27:0.985475 28:0.730869 29:0.961783 30:-0.0747339 31:0.958505 32:0.972071 33:0.961783 34:-0.0929348 35:0.965502 36:0.956082
+-1024 1:-0.399333 2:0.615857 3:0.130099 4:-0.784806 5:0.252427 6:0.246915 7:0.129484 8:-0.760254 9:0.2447 10:0.241817 11:0.132594 12:-0.452607 13:0.280111 14:0.217783 15:0.128882 16:-0.403452 17:0.27162 18:0.227451 19:0.607887 20:0.561696 21:0.731754 22:-0.69808 23:0.26673 24:0.190771 25:0.71155 26:-0.691514 27:0.280757 28:0.177938 29:0.74267 30:-0.0238963 31:0.251432 32:0.244398 33:0.745218 34:-0.0375834 35:0.253954 36:0.233476
+1024 1:-0.4043 2:0.730641 3:0.121613 4:-0.85645 5:0.475163 6:0.43958 7:0.118159 8:-0.8441 9:0.471574 10:0.435724 11:0.123448 12:-0.450061 13:0.48977 14:0.452838 15:0.119729 16:-0.426589 17:0.487451 18:0.451486 19:0.596023 20:0.637992 21:0.696459 22:-0.705148 23:0.383604 24:0.334468 25:0.678177 26:-0.734717 27:0.415504 28:0.296342 29:0.727383 30:0.0567059 31:0.347271 32:0.41321 33:0.722288 34:0.0724098 35:0.351538 36:0.426929
+-1024 1:-0.415683 2:0.752187 3:0.118783 4:-0.623779 5:0.393923 6:0.610829 7:0.120991 8:-0.821597 9:0.480924 10:0.470871 11:0.128021 12:-0.487432 13:0.517569 14:0.455657 15:0.128882 16:-0.430468 17:0.503964 18:0.467712 19:0.651594 20:0.773104 21:0.781166 22:-0.457889 23:0.42141 24:0.765252 25:0.778297 26:-0.905362 27:0.649337 28:0.38631 29:0.801269 30:-0.201236 31:0.639726 32:0.513783 33:0.801269 34:-0.191456 35:0.631787 36:0.509441
+-1024 1:-0.482329 2:0.154456 3:-0.0594312 4:-0.148932 5:-0.433923 6:-0.0605654 7:-0.0375345 8:-0.0540009 9:-0.43281 10:-0.0574026 11:0.013695 12:-0.229975 13:-0.310387 14:-0.31176 15:-0.00383824 16:-0.149863 17:-0.313992 18:-0.283204 19:0.464278 20:0.194588 21:0.491755 22:-0.320478 23:-0.236173 24:-0.0719513 25:0.418345 26:-0.0606245 27:-0.273498 28:0.114918 29:0.582161 30:-0.208341 31:-0.131058 32:-0.28297 33:0.485348 34:0.0239431 35:-0.166752 36:-0.179343
+-1024 1:-0.378014 2:0.652998 3:0.131514 4:-0.59375 5:0.283873 6:0.501149 7:0.139391 8:-0.83882 9:0.37054 10:0.320533 11:0.152411 12:-0.535646 13:0.395441 14:0.287244 15:0.142612 16:-0.427546 17:0.374524 18:0.325868 19:0.618501 20:0.573762 21:0.67293 22:-0.249014 23:0.128382 24:0.616666 25:0.675793 26:-0.793603 27:0.352521 28:0.141203 29:0.727383 30:-0.237082 31:0.364508 32:0.199108 33:0.729931 34:-0.230862 35:0.355896 36:0.192171
+-997.9346305898756 1:-0.445281 2:0.719933 3:0.0919088 4:-0.761929 5:0.402968 6:0.455894 7:0.0898508 8:-0.818273 9:0.427317 10:0.40469 11:0.0975339 12:-0.390786 13:0.435257 14:0.43473 15:0.0922689 16:-0.5027 17:0.481268 18:0.383746 19:0.617877 20:0.683577 21:0.759989 22:-0.682305 23:0.405109 24:0.403317 25:0.737772 26:-0.81623 27:0.47738 28:0.279168 29:0.778339 30:0.0324028 31:0.414733 32:0.468598 33:0.770696 34:-0.0583422 35:0.450049 36:0.427611
+1024 1:1 2:0.982366 3:0.803388 4:-0.97461 5:0.933703 6:0.923013 7:0.803249 8:-0.966455 9:0.934193 10:0.935809 11:0.80792 12:-0.528695 13:0.932905 14:0.92897 15:0.803197 16:-0.487338 17:0.928432 18:0.940076 19:0.867006 20:0.97998 21:0.936468 22:-0.97272 23:0.960542 24:0.738076 25:0.969008 26:-0.93508 27:0.934683 28:0.736174 29:0.951592 30:-0.0579384 31:0.924713 32:0.950947 33:0.936303 34:-0.0955226 35:0.942538 36:0.928179
+1024 1:-0.722839 2:-0.838084 3:-0.579926 4:0.136568 5:-0.995953 6:-0.908468 7:-0.504613 8:0.112534 9:-0.989924 10:-0.933926 11:-0.50611 12:0.444109 13:-0.989106 14:-0.940931 15:-0.464544 16:0.301572 17:-0.983602 18:-0.957166 19:-0.225067 20:-0.63993 21:-0.265887 22:0.51713 23:-0.970568 24:-0.684813 25:-0.237192 26:0.165101 27:-0.911253 28:-0.814162 29:-0.235671 30:0.376892 31:-0.905378 32:-0.851835 33:-0.222932 34:-0.0501677 35:-0.844729 36:-0.909026
+-1024 1:-0.633832 2:-0.675194 3:-0.390398 4:0.354778 5:-0.983825 6:-0.770809 7:-0.370151 8:0.441103 9:-0.982176 10:-0.796065 11:-0.277457 12:0.541563 13:-0.964214 14:-0.869146 15:-0.281482 16:0.564465 17:-0.964815 18:-0.867174 19:0.119618 20:-0.386171 21:0.0282296 22:0.534743 23:-0.869526 24:-0.346309 25:0.0226402 26:0.381433 27:-0.834402 28:-0.486474 29:0.235666 30:-0.00566697 31:-0.714176 32:-0.76696 33:0.20764 34:0.0197591 35:-0.711569 36:-0.755136
+-1024 1:-0.72139 2:-0.912397 3:-0.568612 4:0.00665296 5:-0.998203 6:-0.961372 7:-0.552737 8:0.104349 9:-0.997854 10:-0.963492 11:-0.513732 12:0.43178 13:-0.998079 14:-0.970287 15:-0.49658 16:0.429229 17:-0.997933 18:-0.97227 19:-0.296255 20:-0.751911 21:-0.235299 22:0.501722 23:-0.99125 24:-0.806945 25:-0.246727 26:0.540092 27:-0.988139 28:-0.816453 29:-0.17962 30:0.668988 31:-0.975175 32:-0.883302 33:-0.184715 34:0.587469 35:-0.971047 36:-0.895704
+1024 1:-0.56905 2:-0.254336 3:-0.297049 4:0.242207 5:-0.793167 6:-0.341079 7:-0.266828 8:0.314626 9:-0.78215 10:-0.371097 11:-0.190569 12:-0.0621248 13:-0.669502 14:-0.651196 15:-0.186901 16:-0.254827 17:-0.631989 18:-0.693301 19:0.153334 20:-0.0646407 21:0.103522 22:0.111545 23:-0.540871 24:-0.0768806 25:0.0488596 26:0.0213881 27:-0.462275 28:-0.122384 29:0.220379 30:-0.286623 31:-0.319061 32:-0.520468 33:0.261144 34:-0.495897 35:-0.282074 36:-0.589889
+-1024 1:-0.884502 2:-0.994138 3:-0.772283 4:-0.208157 5:-0.998276 6:-0.996808 7:-0.772122 8:-0.131348 9:-0.998429 10:-0.998172 11:-0.64635 12:0.117829 13:-0.998504 14:-0.998907 15:-0.641504 16:0.137303 17:-0.998688 18:-0.999189 19:-0.613481 20:-0.944803 21:-0.529414 22:0.211407 23:-0.999486 24:-0.979505 25:-0.573304 26:0.236118 27:-0.998193 28:-0.981685 29:-0.44204 30:0.400429 31:-0.998457 32:-0.988024 33:-0.41911 34:0.38937 35:-0.998447 36:-0.988804
+1024 1:-0.71311 2:-0.876292 3:-0.503549 4:0.071419 5:-0.997855 6:-0.945293 7:-0.507444 8:0.200808 9:-0.998126 10:-0.944489 11:-0.431417 12:0.505174 13:-0.997584 14:-0.959692 15:-0.446237 16:0.529101 17:-0.997811 18:-0.958473 19:-0.120782 20:-0.676006 21:-0.0588298 22:0.52604 23:-0.982053 24:-0.754284 25:-0.0631768 26:0.597854 27:-0.980872 28:-0.758459 29:0.0191059 30:0.526118 31:-0.947527 32:-0.874428 33:0.0598707 34:0.582696 35:-0.954094 36:-0.870343
+-1024 1:-0.439485 2:-0.489513 3:-0.114593 4:0.501636 5:-0.955792 6:-0.60618 7:-0.145104 8:0.506995 9:-0.932381 10:-0.618516 11:0.0426565 12:0.407538 13:-0.89879 14:-0.794387 15:0.0190448 16:0.475778 17:-0.900365 18:-0.779494 19:0.694052 20:-0.15036 21:0.397639 22:0.328254 23:-0.706172 24:-0.175913 25:0.2634 26:0.506351 27:-0.691406 28:-0.0236351 29:0.747766 30:-0.562931 31:-0.446519 32:-0.726064 33:0.678978 34:-0.244832 35:-0.506732 36:-0.655374
+-1024 1:-0.452731 2:-0.386838 3:-0.0749897 4:0.585546 5:-0.943076 6:-0.508862 7:-0.113966 8:0.921617 9:-0.956239 10:-0.452386 11:0.0167437 12:0.662528 13:-0.88158 14:-0.679449 15:0.0129426 16:0.539175 17:-0.868609 18:-0.704125 19:0.625369 20:0.0474142 21:0.505872 22:0.299451 23:-0.598064 24:0.028898 25:0.413577 26:0.645468 27:-0.647615 28:0.24712 29:0.625474 30:-0.0762322 31:-0.34472 32:-0.420638 33:0.6535 34:-0.297556 35:-0.294845 36:-0.48651
+-1024 1:-0.518549 2:-0.3835 3:-0.186726 4:0.620386 5:-0.937269 6:-0.445109 7:-0.130951 8:0.404748 9:-0.89756 10:-0.584837 11:-0.0930095 12:0.328027 13:-0.831645 14:-0.714249 15:-0.0862171 16:0.292268 17:-0.827859 18:-0.726205 19:0.311928 20:-0.0760408 21:0.254109 22:0.538103 23:-0.721059 24:0.0544379 25:0.33253 26:-0.0167807 27:-0.570125 28:-0.342437 29:0.37834 30:0.0293659 31:-0.473651 32:-0.50361 33:0.403818 34:-0.185676 35:-0.425416 36:-0.561517
+-1024 1:-0.775416 2:-0.997558 3:-0.547396 4:-0.234769 5:-0.998408 6:-0.999547 7:-0.607936 8:-0.146703 9:-0.998429 10:-0.999392 11:-0.364345 12:0.0953506 13:-0.998833 14:-1 15:-0.391319 16:0.116994 17:-0.998792 18:-0.999981 19:-0.501703 20:-0.990179 21:-0.694121 22:0.102805 23:-0.999309 24:-0.998429 25:-0.814066 26:0.162859 27:-0.998981 28:-0.996937 29:-0.294271 30:0.255553 31:-0.999127 32:-0.999681 33:-0.31465 34:0.255861 35:-0.999202 36:-0.999566
+-1024 1:-0.703588 2:-0.990112 3:-0.466775 4:-0.230707 5:-0.998427 6:-0.999316 7:-0.462152 8:-0.145531 9:-0.998373 10:-0.999242 11:-0.332333 12:0.105629 13:-0.998507 14:-0.99932 15:-0.344028 16:0.128922 17:-0.99856 18:-0.999324 19:-0.489838 20:-0.990386 21:-0.689415 22:0.114213 23:-0.999423 24:-0.997219 25:-0.685343 26:0.173811 27:-0.999547 28:-0.998042 29:-0.414014 30:0.297191 31:-0.999525 32:-0.998565 33:-0.398728 34:0.292961 35:-0.999572 36:-0.998615
+-1024 1:-0.736294 2:-0.801989 3:-0.562954 4:0.0972281 5:-0.989194 6:-0.894389 7:-0.562645 8:0.218722 9:-0.988367 10:-0.891884 11:-0.545743 12:0.385504 13:-0.97813 14:-0.926157 15:-0.536244 16:0.330494 17:-0.975286 18:-0.935376 19:-0.375562 20:-0.637364 21:-0.407062 22:0.255421 23:-0.928641 24:-0.740159 25:-0.4565 26:0.368251 27:-0.926734 28:-0.700639 29:-0.429301 30:0.260096 31:-0.87018 32:-0.845106 33:-0.424206 34:0.080286 35:-0.839737 36:-0.872257
+1024 1:-0.510063 2:-0.386186 3:-0.128737 4:0.501648 5:-0.927809 6:-0.508819 7:-0.157843 8:0.770571 9:-0.937854 10:-0.473173 11:-0.076243 12:0.394314 13:-0.845904 14:-0.712676 15:-0.0770639 16:0.463295 17:-0.854115 18:-0.70497 19:0.510482 20:0.0135429 21:0.409401 22:0.2246 23:-0.560507 24:0.0192252 25:0.387357 26:0.500125 27:-0.625429 28:0.117906 29:0.533753 30:-0.219723 31:-0.336613 32:-0.489816 33:0.554136 34:-0.229886 35:-0.334209 36:-0.492452
+-1024 1:-0.595336 2:-0.566756 3:-0.305535 4:0.702909 5:-0.989846 6:-0.613461 7:-0.203136 8:0.233365 9:-0.942687 10:-0.779048 11:-0.12807 12:0.332415 13:-0.919775 14:-0.846233 15:-0.138084 16:0.489491 17:-0.931657 18:-0.826465 19:0.428688 20:-0.255668 21:0.298815 22:0.97566 23:-0.902113 24:0.0113444 25:0.353984 26:-0.139434 27:-0.662292 28:-0.595668 29:0.54904 30:-0.502548 31:-0.536153 32:-0.779341 33:0.564327 34:-0.282368 35:-0.585376 36:-0.741709
+1024 1:-0.685579 2:-0.725709 3:-0.455461 4:0.235043 5:-0.984384 6:-0.825568 7:-0.462152 8:0.436932 9:-0.989615 10:-0.8213 11:-0.349101 12:0.580124 13:-0.977495 14:-0.887742 15:-0.340977 16:0.495779 17:-0.972516 18:-0.898974 19:0.0340764 20:-0.40429 21:-0.0635357 22:0.384113 23:-0.845614 24:-0.428345 25:-0.11562 26:0.409094 27:-0.833149 28:-0.448583 29:0.0904442 30:-0.0571892 31:-0.697782 32:-0.772822 33:0.0777053 34:-0.231603 35:-0.663711 36:-0.804112
+-517.0177116638291 1:-0.633419 2:-0.857414 3:-0.386155 4:0.161106 5:-0.999416 6:-0.937123 7:-0.316367 8:0.219067 9:-0.998331 10:-0.948499 11:-0.225628 12:0.549409 13:-0.997884 14:-0.959024 15:-0.252497 16:0.582775 17:-0.99825 18:-0.957046 19:0.645974 20:-0.597712 21:0.414107 22:0.778833 23:-0.98896 24:-0.656286 25:0.549454 26:0.49645 27:-0.963656 28:-0.777208 29:0.752861 30:0.544508 31:-0.939792 32:-0.865749 33:0.740122 34:0.65663 35:-0.948624 36:-0.851101
+1024 1:-0.710212 2:-0.998939 3:-0.551639 4:-0.21146 5:-0.999163 6:-0.999274 7:-0.538583 8:-0.129886 9:-0.99915 10:-0.999623 11:-0.403978 12:0.133407 13:-0.999284 14:-0.999411 15:-0.385217 16:0.146044 17:-0.999286 18:-0.999696 19:-0.428016 20:-0.975741 21:-0.543531 22:0.160043 23:-0.999751 24:-0.992405 25:-0.537547 26:0.216834 27:-0.999774 28:-0.993828 29:-0.357963 30:0.367024 31:-0.999727 32:-0.995178 33:-0.352867 34:0.360509 35:-0.999743 36:-0.995331
+1024 1:-0.488952 2:-0.359022 3:-0.113179 4:0.730993 5:-0.948249 6:-0.416407 7:-0.0658426 8:0.72769 9:-0.931307 10:-0.491337 11:-0.00612188 12:0.563641 13:-0.858662 14:-0.674027 15:-0.0160425 16:0.550214 17:-0.853688 18:-0.673741 19:0.428064 20:0.0540237 21:0.381168 22:0.536112 23:-0.650661 24:0.223363 25:0.442183 26:0.228566 27:-0.558903 28:-0.0633241 29:0.500632 30:-0.150881 31:-0.319145 32:-0.438097 33:0.495536 34:-0.126794 35:-0.323251 36:-0.42997
+-1024 1:-0.678334 2:-0.870671 3:-0.497891 4:0.120404 5:-0.999329 6:-0.94225 7:-0.490459 8:0.239783 9:-0.99924 10:-0.943539 11:-0.416173 12:0.552441 13:-0.99839 14:-0.956666 15:-0.417253 16:0.560321 17:-0.998372 18:-0.957099 19:0.291948 20:-0.626727 21:0.171757 22:0.637305 23:-0.985246 24:-0.713555 25:0.196656 26:0.681119 27:-0.982243 28:-0.730874 29:0.304454 30:0.682996 31:-0.951803 32:-0.843224 33:0.304454 34:0.675244 35:-0.951724 36:-0.844769
+-340.6498637790361 1:-0.500749 2:-0.55179 3:-0.130152 4:0.636295 5:-0.986656 6:-0.66263 7:-0.133782 8:0.783543 9:-0.985772 10:-0.675914 11:0.013695 12:0.889201 13:-0.963111 14:-0.781574 15:0.00989157 16:0.888944 17:-0.962966 18:-0.782562 19:0.689056 20:-0.0996625 21:0.576461 22:0.774087 23:-0.814056 24:0.0461655 25:0.561373 26:0.725109 27:-0.790664 28:-0.0589059 29:0.658595 30:0.163206 31:-0.55956 32:-0.52755 33:0.69681 34:0.147536 35:-0.565931 36:-0.5422
+1024 1:-0.594301 2:-0.616809 3:-0.289977 4:0.470737 5:-0.984249 6:-0.723084 7:-0.258335 8:0.512079 9:-0.97865 10:-0.760533 11:-0.160082 12:0.583912 13:-0.955701 14:-0.843819 15:-0.168594 16:0.650735 17:-0.959326 18:-0.835806 19:0.250114 20:-0.294197 21:0.19764 22:0.585095 23:-0.842712 24:-0.227434 25:0.189504 26:0.31917 27:-0.77937 28:-0.423708 29:0.385984 30:-0.102742 31:-0.636451 32:-0.725497 33:0.327384 34:0.0301828 35:-0.654243 36:-0.691459
+-1024 1:-0.657014 2:-0.830359 3:-0.456875 4:0.193088 5:-0.99868 6:-0.916758 7:-0.449414 8:0.319859 9:-0.998547 10:-0.918322 11:-0.361296 12:0.609848 13:-0.99637 14:-0.9392 15:-0.359283 16:0.614303 17:-0.996308 18:-0.940069 19:0.310055 20:-0.563555 21:0.192934 22:0.638695 23:-0.972443 24:-0.644892 25:0.210956 26:0.677714 27:-0.968141 28:-0.666205 29:0.28917 30:0.574917 31:-0.916218 32:-0.813237 33:0.281526 34:0.563348 35:-0.915213 36:-0.814653
+1024 1:-0.625347 2:-0.418262 3:-0.312607 4:0.335908 5:-0.907989 6:-0.542931 7:-0.296551 8:0.493767 9:-0.90486 10:-0.531051 11:-0.240872 12:0.149499 13:-0.809929 14:-0.745144 15:-0.273855 16:0.1946 17:-0.805258 18:-0.728131 19:0.105256 20:-0.11695 21:0.202346 22:-0.0418064 23:-0.558106 24:-0.288769 25:0.127525 26:0.242051 27:-0.600461 28:-0.12471 29:0.281526 30:-0.321221 31:-0.378204 32:-0.587599 33:0.18471 34:-0.0755135 35:-0.412213 36:-0.500323
+-326.2069321130665 1:-0.720976 2:-0.995854 3:-0.401713 4:-0.244329 5:-0.998166 6:-0.999769 7:-0.479136 8:-0.149864 9:-0.998586 10:-0.999812 11:-0.283554 12:0.100694 13:-0.998786 14:-0.999782 15:-0.273855 16:0.116165 17:-0.99881 18:-1 19:-0.519812 20:-0.996515 21:-0.65412 22:0.0947733 23:-0.999447 24:-1 25:-0.80453 26:0.157246 27:-0.999191 28:-0.998511 29:-0.31465 30:0.253894 31:-0.999339 32:-1 33:-0.332485 34:0.245856 35:-0.999283 36:-1
+1024 1:-0.677092 2:-0.830379 3:-0.451217 4:0.216294 5:-0.999607 6:-0.918826 7:-0.47772 8:0.370877 9:-0.999764 10:-0.913648 11:-0.384161 12:0.70545 13:-0.99904 14:-0.935765 15:-0.374539 16:0.700282 17:-0.998871 18:-0.937565 19:0.491126 20:-0.503458 21:0.421166 22:0.752357 23:-0.973589 24:-0.579892 25:0.3516 26:0.978603 27:-0.984141 28:-0.541211 29:0.630569 30:0.919801 31:-0.942041 32:-0.755209 33:0.620378 34:0.849191 35:-0.936762 36:-0.76744
+101.189935549926 1:-0.537383 2:-0.714216 3:-0.207942 4:0.322778 5:-0.990531 6:-0.835208 7:-0.20172 8:0.474499 9:-0.991534 10:-0.839558 11:-0.147887 12:0.700802 13:-0.981602 14:-0.877853 15:-0.11215 16:0.636398 17:-0.979672 18:-0.890575 19:0.705291 20:-0.35506 21:0.522343 22:0.715475 23:-0.92055 24:-0.385467 25:0.49701 26:0.71236 27:-0.907495 28:-0.429386 29:0.538849 30:0.572396 31:-0.805571 32:-0.65084 33:0.5949 34:0.257691 35:-0.762984 36:-0.720058
+-1024 1:-0.714766 2:-0.710025 3:-0.456875 4:0.269944 5:-0.985707 6:-0.814202 7:-0.487628 8:0.496664 9:-0.990384 10:-0.794944 11:-0.420746 12:0.493757 13:-0.965987 14:-0.879659 15:-0.409626 16:0.629195 17:-0.975146 18:-0.868785 19:-0.166368 20:-0.449197 21:-0.152946 22:0.307121 23:-0.850819 24:-0.512599 25:-0.137075 26:0.523129 27:-0.885756 28:-0.478435 29:-0.0904493 30:0.0120307 31:-0.718494 32:-0.771173 33:-0.0522324 34:0.0716762 35:-0.737978 36:-0.766474
+1024 1:-0.699448 2:-0.673714 3:-0.517693 4:0.368078 5:-0.98341 6:-0.74095 7:-0.439506 8:0.0529719 9:-0.945335 10:-0.849723 11:-0.478671 12:0.527287 13:-0.952272 14:-0.838352 15:-0.467595 16:-0.0425581 17:-0.896098 18:-0.912257 19:-0.197591 20:-0.471306 21:-0.237652 22:0.456259 23:-0.895418 24:-0.476112 25:-0.244344 26:-0.165843 27:-0.728852 28:-0.737967 29:-0.215289 30:0.190756 31:-0.76236 32:-0.74768 33:-0.174524 34:-0.301836 35:-0.668812 36:-0.842395
+-1024 1:-0.5349 2:-0.549625 3:-0.227744 4:0.561744 5:-0.977816 6:-0.646115 7:-0.156427 8:0.50621 9:-0.964097 10:-0.717956 11:-0.102156 12:0.593443 13:-0.934366 14:-0.799317 15:-0.0923177 16:0.548035 17:-0.931164 18:-0.809822 19:0.342523 20:-0.200998 21:0.272932 22:0.6926 23:-0.839755 24:-0.101025 25:0.346833 26:0.268061 27:-0.743681 28:-0.39828 29:0.393627 30:0.173493 31:-0.625614 32:-0.598189 33:0.403818 34:-0.0535697 35:-0.578317 36:-0.648968
+-1024 1:-0.860699 2:-0.92211 3:-0.671862 4:-0.078674 5:-0.996032 6:-0.96923 7:-0.682953 8:0.0380237 9:-0.997094 10:-0.969873 11:-0.638728 12:0.31363 13:-0.99649 14:-0.977991 15:-0.64303 16:0.329101 17:-0.99665 18:-0.978371 19:-0.604738 20:-0.785594 21:-0.48471 22:0.328946 23:-0.982498 24:-0.849758 25:-0.511327 26:0.376297 27:-0.981517 28:-0.861108 29:-0.462423 30:0.364897 31:-0.958242 32:-0.92588 33:-0.46497 34:0.355938 35:-0.959102 36:-0.928523
+-1024 1:-0.73857 2:-0.95126 3:-0.632259 4:-0.103522 5:-0.998225 6:-0.982959 7:-0.607936 8:-0.0134418 9:-0.998031 10:-0.984267 11:-0.56251 12:0.278458 13:-0.998227 14:-0.987648 15:-0.533193 16:0.285586 17:-0.998247 18:-0.988801 19:-0.384304 20:-0.848408 21:-0.409415 22:0.318172 23:-0.994109 24:-0.91306 25:-0.420743 26:0.363267 27:-0.991914 28:-0.915033 29:-0.393632 30:0.468884 31:-0.985159 32:-0.948251 33:-0.383441 34:0.377729 35:-0.97977 36:-0.955831
+1024 1:-0.613756 2:-0.497591 3:-0.336652 4:0.336461 5:-0.939538 6:-0.63016 7:-0.340429 8:0.522255 9:-0.944292 10:-0.623078 11:-0.307944 12:0.277772 13:-0.866119 14:-0.778431 15:-0.305891 16:0.259971 17:-0.863294 18:-0.785088 19:0.19267 20:-0.228324 21:0.0894041 22:0.232509 23:-0.732806 24:-0.329011 25:0.120374 26:0.264044 27:-0.731937 28:-0.37196 29:0.169423 30:0.00646445 31:-0.571658 32:-0.620515 33:0.159232 34:-0.0207668 35:-0.564239 36:-0.625707
+-943.1631801448273 1:-0.469703 2:-0.511123 3:-0.103278 4:0.596097 5:-0.977411 6:-0.639684 7:-0.145104 8:0.863483 9:-0.981814 10:-0.608154 11:0.0213168 12:0.874013 13:-0.949495 14:-0.750952 15:0.0190448 16:0.769096 17:-0.941747 18:-0.767841 19:0.659711 20:-0.0634902 21:0.51999 22:0.499288 23:-0.738085 24:-0.0624241 25:0.439799 26:0.826743 27:-0.778428 28:0.109792 29:0.656047 30:0.138001 31:-0.521379 32:-0.497271 33:0.671334 34:-0.0756506 35:-0.476329 36:-0.552666
+1024 1:-0.567808 2:-0.662097 3:-0.285733 4:0.44154 5:-0.989722 6:-0.768754 7:-0.228613 8:0.448116 9:-0.982807 10:-0.807963 11:-0.160082 12:0.655438 13:-0.970181 14:-0.858179 15:-0.151814 16:0.615431 17:-0.968011 18:-0.865751 19:0.356259 20:-0.311913 21:0.284697 22:0.767265 23:-0.90817 24:-0.259088 25:0.356368 26:0.472615 27:-0.852692 28:-0.468338 29:0.414009 30:0.347079 31:-0.751124 32:-0.670199 33:0.419105 34:0.136097 35:-0.71374 36:-0.711109
+-225.7334436543802 1:-0.617068 2:-0.466662 3:-0.318265 4:0.801889 5:-0.975651 6:-0.446173 7:-0.198889 8:0.257954 9:-0.909482 10:-0.689452 11:-0.172277 12:0.298425 13:-0.867201 14:-0.775563 15:-0.186901 16:0.405118 17:-0.876202 18:-0.758068 19:0.069042 20:-0.117455 21:0.171757 22:0.968795 23:-0.838307 24:0.29422 25:0.21334 26:-0.293071 27:-0.494724 28:-0.486404 29:0.248405 30:-0.494468 31:-0.341839 32:-0.638481 33:0.258596 34:-0.330488 35:-0.397853 36:-0.610198
+-1024 1:-0.718285 2:-0.916451 3:-0.565783 4:-0.0285822 5:-0.997916 6:-0.968273 7:-0.56406 8:0.0946412 9:-0.998317 10:-0.967391 11:-0.51678 12:0.367327 13:-0.997284 14:-0.975739 15:-0.495054 16:0.367977 17:-0.997196 18:-0.977523 19:-0.2157 20:-0.727471 21:-0.221181 22:0.418833 23:-0.982572 24:-0.814374 25:-0.196667 26:0.536872 27:-0.986441 28:-0.81333 29:-0.131212 30:0.550525 31:-0.964626 32:-0.894958 33:-0.108281 34:0.416836 35:-0.95475 36:-0.911681
+-1024 1:-0.749541 2:-0.947094 3:-0.637916 4:-0.0757976 5:-0.998559 6:-0.979058 7:-0.626337 8:0.0182485 9:-0.998387 10:-0.980244 11:-0.596046 12:0.332181 13:-0.99878 14:-0.983608 15:-0.575906 16:0.334117 17:-0.998659 18:-0.984915 19:-0.349959 20:-0.818378 21:-0.334121 22:0.454219 23:-0.997326 24:-0.871653 25:-0.344463 26:0.515324 27:-0.996723 28:-0.876923 29:-0.24841 30:0.723837 31:-0.993741 32:-0.919184 33:-0.230576 34:0.673479 35:-0.992522 36:-0.926509
+-211.7435703083814 1:-0.560564 2:-0.681896 3:-0.24613 4:0.490793 5:-0.996692 6:-0.797738 7:-0.249843 8:0.630372 9:-0.996168 10:-0.804518 11:-0.105205 12:0.89815 13:-0.989522 14:-0.858604 15:-0.107573 16:0.904648 17:-0.989852 18:-0.859519 19:0.72652 20:-0.260805 21:0.607047 22:0.892729 23:-0.91592 24:-0.192633 25:0.599511 26:0.877942 27:-0.903593 28:-0.269385 29:0.724835 30:0.536227 31:-0.76971 32:-0.618157 33:0.757957 34:0.51127 35:-0.770726 36:-0.630586
+-1024 1:-0.626381 2:-0.639703 3:-0.311193 4:0.366339 5:-0.981795 6:-0.765007 7:-0.317782 8:0.529336 9:-0.980641 10:-0.754644 11:-0.237823 12:0.538278 13:-0.954653 14:-0.851215 15:-0.273855 16:0.563525 17:-0.952737 18:-0.841892 19:0.169568 20:-0.30084 21:0.195287 22:0.290333 23:-0.798084 24:-0.420891 25:0.156131 26:0.57957 27:-0.836117 28:-0.309169 29:0.291718 30:-0.0676693 31:-0.630333 32:-0.708012 33:0.220379 34:0.133187 35:-0.659752 36:-0.655131
+-1024 1:-0.702553 2:-0.913471 3:-0.496477 4:0.0234192 5:-0.999466 6:-0.967598 7:-0.491875 8:0.135099 9:-0.999503 10:-0.968645 11:-0.445135 12:0.453997 13:-0.999217 14:-0.974008 15:-0.4081 16:0.451324 17:-0.999179 18:-0.976392 19:0.451166 20:-0.688829 21:0.376462 22:0.594602 23:-0.99156 24:-0.793377 25:0.365903 26:0.649249 27:-0.990064 28:-0.802646 29:0.457322 30:0.836 31:-0.979326 32:-0.866496 33:0.546492 34:0.73453 35:-0.976245 36:-0.885302
+1024 1:-0.693859 2:-0.878802 3:-0.483747 4:0.10784 5:-0.999469 6:-0.947686 7:-0.460737 8:0.207734 9:-0.99915 10:-0.951821 11:-0.390259 12:0.544534 13:-0.999074 14:-0.962313 15:-0.411151 16:0.561287 17:-0.999065 18:-0.960955 19:0.389352 20:-0.646338 21:0.317638 22:0.679871 23:-0.991261 24:-0.735501 25:0.308692 26:0.672522 27:-0.986098 28:-0.763789 29:0.543944 30:0.773966 31:-0.970064 32:-0.861366 33:0.487893 34:0.83526 35:-0.973212 36:-0.850723
+-1024 1:-0.715387 2:-0.861232 3:-0.543152 4:0.0521656 5:-0.995176 6:-0.935666 7:-0.532921 8:0.204985 9:-0.996861 10:-0.934846 11:-0.493915 12:0.431233 13:-0.992758 14:-0.953479 15:-0.48285 16:0.428272 17:-0.99256 18:-0.955935 19:-0.171364 20:-0.622847 21:-0.155299 22:0.434289 23:-0.958504 24:-0.713023 25:-0.151378 26:0.515245 27:-0.958886 28:-0.716544 29:-0.0980927 30:0.435092 31:-0.912298 32:-0.844974 33:-0.0955449 34:0.277506 35:-0.893008 36:-0.866842
+1024 1:-0.664673 2:-0.802395 3:-0.496477 4:0.168027 5:-0.993008 6:-0.888475 7:-0.457906 8:0.250277 9:-0.991559 10:-0.901185 11:-0.388735 12:0.491472 13:-0.987071 14:-0.930186 15:-0.388268 16:0.49208 17:-0.986848 18:-0.931869 19:-0.192595 20:-0.59533 21:-0.303535 22:0.529507 23:-0.954022 24:-0.597712 25:-0.29202 26:0.395446 27:-0.925808 28:-0.690039 29:-0.161785 30:0.247691 31:-0.874425 32:-0.854385 33:-0.184715 34:0.196148 35:-0.865969 36:-0.860087
+-1024 1:-0.475498 2:-0.362713 3:-0.0735753 4:0.700611 5:-0.944545 6:-0.431256 7:-0.0715042 8:0.825916 9:-0.941473 10:-0.463064 11:0.0152193 12:0.599473 13:-0.864422 14:-0.671508 15:0.00989157 16:0.587363 17:-0.862658 18:-0.676046 19:0.622247 20:0.0751837 21:0.51999 22:0.525529 23:-0.618129 24:0.27877 25:0.506545 26:0.431749 27:-0.579675 28:0.131256 29:0.620378 30:-0.193019 31:-0.275895 32:-0.41481 33:0.648404 34:-0.213811 35:-0.282453 36:-0.432052
+1024 1:-0.378014 2:-0.257494 3:-0.0863049 4:0.477171 5:-0.850706 6:-0.31104 7:-0.118213 8:0.356532 9:-0.785715 10:-0.362648 11:0.089912 12:-0.170691 13:-0.685423 14:-0.704382 15:0.09532 16:-0.00357997 17:-0.708149 18:-0.677383 19:0.67532 20:-0.000739771 21:0.404695 22:0.178162 23:-0.521748 24:0.0406238 25:0.289622 26:0.291778 27:-0.475709 28:0.18664 29:0.763052 30:-0.848986 31:-0.206742 32:-0.658925 33:0.732479 34:-0.558834 35:-0.274625 36:-0.589222
+1024 1:-0.591403 2:-0.346437 3:-0.278661 4:0.906713 5:-0.951866 6:-0.243612 7:-0.170581 8:0.074162 9:-0.825936 10:-0.621221 11:-0.153984 12:0.0375529 13:-0.761823 14:-0.723289 15:-0.165543 16:0.150282 17:-0.774091 18:-0.70378 19:0.105256 20:-0.0362233 21:0.19764 22:0.905405 23:-0.773002 24:0.43088 25:0.23241 26:-0.481313 27:-0.351896 28:-0.463272 29:0.278979 30:-0.582505 31:-0.223736 32:-0.582867 33:0.291715 34:-0.439812 35:-0.276825 36:-0.556725
+394.4110624064522 1:-0.589333 2:-0.700769 3:-0.318265 4:0.488938 5:-0.99727 6:-0.795849 7:-0.18332 8:0.353607 9:-0.985918 10:-0.858886 11:-0.0914852 12:0.601611 13:-0.979026 14:-0.894988 15:-0.118252 16:0.697505 17:-0.982781 18:-0.885022 19:0.504239 20:-0.378964 21:0.329403 22:0.955549 23:-0.950291 24:-0.249782 25:0.39689 26:0.14725 27:-0.818395 28:-0.64771 29:0.612735 30:-0.116227 31:-0.725517 32:-0.808641 33:0.617831 34:0.108276 35:-0.764741 36:-0.774637
+-1024 1:-0.71932 2:-0.93012 3:-0.540324 4:-0.080018 5:-0.997161 6:-0.976469 7:-0.581044 8:0.0431351 9:-0.99763 10:-0.973837 11:-0.486293 12:0.33994 13:-0.997992 14:-0.981588 15:-0.488952 16:0.352076 17:-0.997959 18:-0.981791 19:-0.290635 20:-0.794333 21:-0.324711 22:0.4044 23:-0.991383 24:-0.858999 25:-0.34208 26:0.489039 27:-0.992425 28:-0.861577 29:-0.194906 30:0.559024 31:-0.982387 32:-0.927089 33:-0.20255 34:0.553126 35:-0.982368 36:-0.927531
+-1024 1:-0.615205 2:-0.676232 3:-0.325337 4:0.344841 5:-0.987361 6:-0.799904 7:-0.339013 8:0.506934 9:-0.986661 10:-0.790201 11:-0.230201 12:0.598335 13:-0.969779 14:-0.869097 15:-0.235716 16:0.616464 17:-0.970485 18:-0.868466 19:0.0677938 20:-0.336234 21:0.138816 22:0.426236 23:-0.858083 24:-0.435443 25:0.0226402 26:0.666297 27:-0.872817 28:-0.314498 29:0.189806 30:0.101131 31:-0.712341 32:-0.725523 33:0.194901 34:0.134807 35:-0.721935 36:-0.722099
+1024 1:-0.519791 2:-0.540092 3:-0.175411 4:0.621024 5:-0.982761 6:-0.643004 7:-0.128121 8:0.695479 9:-0.9775 10:-0.679291 11:-0.0305118 12:0.778888 13:-0.95054 14:-0.7811 15:-0.0404511 16:0.761569 17:-0.947747 18:-0.781529 19:0.474268 20:-0.109007 21:0.411754 22:0.753279 23:-0.817577 24:0.0284061 25:0.461253 26:0.5265 27:-0.759388 28:-0.183058 29:0.533753 30:0.0848031 31:-0.553763 32:-0.559441 33:0.528658 34:0.0901051 35:-0.551892 36:-0.555167
+-1024 1:-0.63197 2:-0.424957 3:-0.355038 4:0.401106 5:-0.909255 6:-0.490847 7:-0.323444 8:0.321599 9:-0.886626 10:-0.590619 11:-0.24697 12:0.00997472 13:-0.795857 14:-0.774812 15:-0.258599 16:0.134219 17:-0.810004 18:-0.75615 19:0.152086 20:-0.153487 21:0.11764 22:0.325719 23:-0.652506 24:-0.0397605 25:0.172817 26:0.0127913 27:-0.56939 28:-0.31534 29:0.340123 30:-0.416862 31:-0.409824 32:-0.655102 33:0.291715 34:-0.274112 35:-0.431379 36:-0.612374
+-348.7588569085703 1:-0.733603 2:-0.948698 3:-0.562954 4:-0.0953977 5:-0.998453 6:-0.983519 7:-0.559814 8:0.00849296 9:-0.998637 10:-0.983912 11:-0.45733 12:0.302178 13:-0.998812 14:-0.988117 15:-0.470646 16:0.326077 17:-0.998958 18:-0.987791 19:-0.0833146 20:-0.811932 21:-0.0823593 22:0.453607 23:-0.997887 24:-0.892992 25:-0.0655606 26:0.511919 27:-0.997157 28:-0.897615 29:0.108276 30:0.688393 31:-0.994155 32:-0.937589 33:0.103181 34:0.702259 35:-0.994619 36:-0.935836
+1024 1:-0.720355 2:-0.855266 3:-0.51345 4:0.0966261 5:-0.996035 6:-0.927492 7:-0.491875 8:0.187837 9:-0.995862 10:-0.936576 11:-0.422271 12:0.488987 13:-0.994819 14:-0.952157 15:-0.453865 16:0.525316 17:-0.995309 18:-0.948877 19:-0.119533 20:-0.62615 21:-0.117654 22:0.655595 23:-0.981465 24:-0.646754 25:-0.0798634 26:0.494821 27:-0.961138 28:-0.74364 29:-0.00382424 30:0.436856 31:-0.920671 32:-0.857915 33:-0.0267544 34:0.569121 35:-0.933954 36:-0.837142
+-704.3811180848631 1:-0.722839 2:-0.924194 3:-0.578511 4:-0.0392428 5:-0.998003 6:-0.970645 7:-0.588121 8:0.040989 9:-0.996875 10:-0.971209 11:-0.557937 12:0.312031 13:-0.995992 14:-0.978168 15:-0.553025 16:0.324667 17:-0.995994 18:-0.978596 19:0.0153454 20:-0.744195 21:-0.0235354 22:0.468296 23:-0.990016 24:-0.834779 25:0.0178726 26:0.55183 27:-0.991501 28:-0.845523 29:0.0878964 30:0.626528 31:-0.977623 32:-0.906803 33:0.105728 34:0.59516 35:-0.97554 36:-0.909341
+1024 1:-0.681853 2:-0.79549 3:-0.476675 4:0.218265 5:-0.994519 6:-0.878215 7:-0.436675 8:0.274757 9:-0.992505 10:-0.898672 11:-0.36282 12:0.537262 13:-0.988483 14:-0.925857 15:-0.39437 16:0.593113 17:-0.989872 18:-0.919724 19:-0.079568 20:-0.536592 21:-0.0917711 22:0.728295 23:-0.967097 24:-0.504327 25:-0.0560254 26:0.426732 27:-0.923279 28:-0.674518 29:0.00381915 30:0.268055 31:-0.853998 32:-0.823865 33:-0.0165632 34:0.430541 35:-0.875235 36:-0.794875
+-1024 1:-0.7572 2:-0.892176 3:-0.639331 4:0.0151486 5:-0.996622 6:-0.948045 7:-0.623506 8:0.118065 9:-0.99607 10:-0.949355 11:-0.591473 12:0.381498 13:-0.994228 14:-0.963067 15:-0.582008 16:0.370976 17:-0.99384 18:-0.966154 19:-0.434886 20:-0.737858 21:-0.442357 22:0.323732 23:-0.971421 24:-0.815027 25:-0.494641 26:0.36847 27:-0.963174 28:-0.804439 29:-0.482805 30:0.330388 31:-0.930106 32:-0.896303 33:-0.467518 34:0.2013 35:-0.914477 36:-0.913344
+1024 1:-0.671296 2:-0.839676 3:-0.47809 4:0.158321 5:-0.997228 6:-0.916313 7:-0.426767 8:0.157392 9:-0.993197 10:-0.936092 11:-0.37654 12:0.45888 13:-0.991737 14:-0.948894 15:-0.37759 16:0.455203 17:-0.991309 18:-0.950457 19:-0.0377293 20:-0.582626 21:-0.0117707 22:0.745226 23:-0.98146 24:-0.585339 25:0.0560111 26:0.336111 27:-0.932947 28:-0.766314 29:0.179615 30:0.358992 31:-0.900036 32:-0.853946 33:0.146493 34:0.284302 35:-0.890647 36:-0.862622
+1024 1:-0.732775 2:-0.945017 3:-0.601142 4:-0.0448436 5:-0.999691 6:-0.980015 7:-0.571137 8:0.0546237 9:-0.999703 10:-0.982007 11:-0.510683 12:0.358004 13:-0.999613 14:-0.98556 15:-0.513361 16:0.372693 17:-0.999628 18:-0.98564 19:0.110876 20:-0.757258 21:0.0776393 22:0.482069 23:-0.992539 24:-0.847378 25:0.0917681 26:0.590015 27:-0.996111 28:-0.861358 29:0.217832 30:0.734478 31:-0.988194 32:-0.911219 33:0.217832 34:0.729564 35:-0.987891 36:-0.910805
+-1024 1:-0.713524 2:-0.805927 3:-0.485162 4:0.194426 5:-0.995911 6:-0.894825 7:-0.503198 8:0.359666 9:-0.996883 10:-0.888858 11:-0.417698 12:0.582895 13:-0.991993 14:-0.925606 15:-0.423356 16:0.633211 17:-0.993553 18:-0.922739 19:-0.165119 20:-0.568037 21:-0.115301 22:0.4562 23:-0.940408 24:-0.635354 25:-0.110853 26:0.599978 27:-0.950193 28:-0.624085 29:-0.0598758 30:0.288 31:-0.860838 32:-0.825352 33:-0.0242066 34:0.338743 35:-0.871803 36:-0.821737
+1024 1:-0.488331 2:-0.430674 3:-0.134395 4:0.509043 5:-0.940473 6:-0.544837 7:-0.140858 8:0.525883 9:-0.921003 10:-0.57504 11:-0.175326 12:0.344371 13:-0.83047 14:-0.704938 15:-0.135033 16:0.387927 17:-0.847993 18:-0.721181 19:0.515478 20:-0.0609317 21:0.359992 22:0.265796 23:-0.622171 24:-0.0597737 25:0.3516 26:0.18765 27:-0.580625 28:-0.150577 29:0.391079 30:0.153717 31:-0.430774 32:-0.389822 33:0.454774 34:-0.222784 35:-0.354263 36:-0.511006
+-1024 1:-0.701311 2:-0.755 3:-0.483747 4:0.200313 5:-0.98856 6:-0.856056 7:-0.47772 8:0.355625 9:-0.990185 10:-0.855881 11:-0.416173 12:0.488657 13:-0.977076 14:-0.905549 15:-0.412677 16:0.476726 17:-0.976392 18:-0.909583 19:-0.138267 20:-0.470012 21:-0.112948 22:0.345942 23:-0.885852 24:-0.569165 25:-0.148994 26:0.376668 27:-0.87253 28:-0.574584 29:-0.0777103 30:0.194912 31:-0.784608 32:-0.770013 33:-0.0726148 34:0.0429203 35:-0.75689 36:-0.7965
+900.772066848342 1:-0.459355 2:-0.540269 3:-0.125908 4:0.473108 5:-0.967476 6:-0.666634 7:-0.153596 8:0.532931 9:-0.953608 10:-0.669344 11:0.0335118 12:0.505053 13:-0.92775 14:-0.815878 15:0.00226389 16:0.563747 17:-0.928304 18:-0.802351 19:0.7209 20:-0.192261 21:0.409401 22:0.364221 23:-0.748373 24:-0.235832 25:0.272935 26:0.565084 27:-0.743527 28:-0.0864012 29:0.7656 30:-0.449849 31:-0.515194 32:-0.73855 33:0.684073 34:-0.139934 35:-0.569787 36:-0.671146
+1024 1:-0.797358 2:-0.599876 3:-0.568612 4:0.0983653 5:-0.932768 6:-0.731836 7:-0.568306 8:0.181839 9:-0.930039 10:-0.747081 11:-0.532024 12:0.0344944 13:-0.867361 14:-0.85593 15:-0.534718 16:0.0111244 17:-0.862423 18:-0.862601 19:-0.478598 20:-0.454467 21:-0.38118 22:0.105846 23:-0.791116 24:-0.568975 25:-0.401672 26:0.0683224 27:-0.772511 28:-0.628673 29:-0.335033 30:-0.106665 31:-0.672091 32:-0.783829 33:-0.337581 34:-0.145336 35:-0.667786 36:-0.794789
+1024 1:-0.590782 2:-0.612221 3:-0.285733 4:0.441528 5:-0.983157 6:-0.734284 7:-0.286643 8:0.638482 9:-0.984975 10:-0.725934 11:-0.185996 12:0.654986 13:-0.957724 14:-0.827757 15:-0.200631 16:0.696693 17:-0.959602 18:-0.821923 19:0.216397 20:-0.226941 21:0.221167 22:0.498015 23:-0.821101 24:-0.264253 25:0.218107 26:0.656374 27:-0.83016 28:-0.217443 29:0.286622 30:0.134771 31:-0.633493 32:-0.625314 33:0.286622 34:0.182992 35:-0.643944 36:-0.614946
+-1014.95102141596 1:-0.680197 2:-0.539937 3:-0.350796 4:0.551855 5:-0.9746 6:-0.615849 7:-0.372982 8:0.721225 9:-0.9758 10:-0.619971 11:-0.285079 12:0.721403 13:-0.940045 14:-0.761457 15:-0.286059 16:0.679494 17:-0.934049 18:-0.765344 19:0.0496865 20:-0.130942 21:0.138816 22:0.509466 23:-0.727569 24:0.012544 25:0.122758 26:0.52727 27:-0.741597 28:-0.11164 29:0.210188 30:-0.0799215 31:-0.484902 32:-0.573682 33:0.197449 34:-0.0600432 35:-0.476709 36:-0.556185
+1024 1:-0.673987 2:-0.830717 3:-0.459704 4:0.178146 5:-0.997405 6:-0.912006 7:-0.472059 8:0.298851 9:-0.997433 10:-0.914932 11:-0.375015 12:0.615226 13:-0.996379 14:-0.937792 15:-0.380641 16:0.62406 17:-0.996293 18:-0.937478 19:0.12711 20:-0.580769 21:0.115287 22:0.631734 23:-0.970052 24:-0.631755 25:0.0989195 26:0.626144 27:-0.963097 28:-0.673203 29:0.296811 30:0.531104 31:-0.920511 32:-0.833783 33:0.281526 34:0.550281 35:-0.921427 36:-0.828649
+1024 1:-0.608168 2:-0.358778 3:-0.299877 4:0.897645 5:-0.951801 6:-0.24418 7:-0.247012 8:-0.199407 9:-0.770379 10:-0.679505 11:-0.167704 12:-0.35597 13:-0.708877 14:-0.796332 15:-0.173171 16:-0.110185 17:-0.74392 18:-0.760366 19:0.354386 20:-0.129843 21:0.294109 22:0.949062 23:-0.83037 24:0.267044 25:0.325379 26:-0.445122 27:-0.456829 28:-0.555755 29:0.513371 30:-0.780088 31:-0.336344 32:-0.735396 33:0.521014 34:-0.615644 35:-0.377654 36:-0.702377
+1024 1:-0.690547 2:-0.886741 3:-0.489405 4:0.061847 5:-0.998821 6:-0.954377 7:-0.483382 8:0.185427 9:-0.998805 10:-0.953541 11:-0.420746 12:0.49599 13:-0.998396 14:-0.964933 15:-0.427932 16:0.50933 17:-0.998441 18:-0.964855 19:0.17831 20:-0.634737 21:0.141169 22:0.617274 23:-0.985391 24:-0.725969 25:0.182353 26:0.730942 27:-0.987138 28:-0.726217 29:0.30955 30:0.730193 31:-0.960549 32:-0.847464 33:0.301906 34:0.736069 35:-0.961141 36:-0.846293
+-1024 1:-0.74892 2:-0.988827 3:-0.591241 4:-0.230105 5:-0.997908 6:-0.998436 7:-0.552737 8:-0.145965 9:-0.997927 10:-0.998575 11:-0.448184 12:0.103188 13:-0.998158 14:-0.999024 15:-0.446237 16:0.12142 17:-0.998229 18:-0.999215 19:-0.4174 20:-0.979429 21:-0.632944 22:0.124588 23:-0.998994 24:-0.994212 25:-0.573304 26:0.178868 27:-0.999122 28:-0.996269 29:-0.322294 30:0.30681 31:-0.999127 32:-0.997401 33:-0.329937 34:0.300127 35:-0.999087 36:-0.99739
+-1024 1:-0.796737 2:-0.992056 3:-0.636502 4:-0.220667 5:-0.99822 6:-0.998158 7:-0.630583 8:-0.140014 9:-0.998185 10:-0.998558 11:-0.461903 12:0.116351 13:-0.99862 14:-0.999103 15:-0.438611 16:0.126743 17:-0.998647 18:-0.999516 19:-0.277522 20:-0.960004 21:-0.430592 22:0.159457 23:-0.998928 24:-0.989102 25:-0.449348 26:0.205349 27:-0.998541 28:-0.990631 29:-0.169428 30:0.355883 31:-0.998813 32:-0.993924 33:-0.115925 34:0.338993 35:-0.998723 36:-0.994634
+1024 1:-0.497231 2:0.0440045 3:-0.107521 4:0.496698 5:-0.693824 6:0.0975439 7:-0.0799967 8:0.243473 9:-0.612063 10:-0.11002 11:-0.0488043 12:-0.151193 13:-0.424279 14:-0.403219 15:-0.0572319 16:-0.123659 17:-0.422562 18:-0.397044 19:0.389976 20:0.301464 21:0.388227 22:0.0470525 23:-0.237676 24:0.412749 25:0.437415 26:-0.365132 27:-0.090104 28:0.0338562 29:0.523562 30:-0.243414 31:0.00976895 32:-0.168675 33:0.518466 34:-0.206249 35:0.00258426 36:-0.153361
+-1024 1:-0.69034 2:-0.866487 3:-0.523351 4:0.0605699 5:-0.995695 6:-0.936465 7:-0.524429 8:0.174974 9:-0.99584 10:-0.93833 11:-0.445135 12:0.464476 13:-0.994933 14:-0.955896 15:-0.446237 16:0.470549 17:-0.994833 18:-0.956633 19:-0.270652 20:-0.682467 21:-0.34118 22:0.488503 23:-0.974574 24:-0.716406 25:-0.323009 26:0.467586 27:-0.965644 28:-0.759973 29:-0.210193 30:0.398271 31:-0.932663 32:-0.883606 33:-0.220384 34:0.368039 35:-0.929969 36:-0.887504
+-1024 1:-0.704209 2:-0.902273 3:-0.489405 4:0.0451234 5:-0.999334 6:-0.962013 7:-0.508859 8:0.166383 9:-0.999259 10:-0.959974 11:-0.407027 12:0.482627 13:-0.999243 14:-0.971301 15:-0.434034 16:0.503913 17:-0.999266 18:-0.969728 19:0.484883 20:-0.673557 21:0.428225 22:0.574054 23:-0.988555 24:-0.788578 25:0.301541 26:0.730604 27:-0.991908 28:-0.766349 29:0.684073 30:0.802651 31:-0.978999 32:-0.878348 33:0.584709 34:0.882058 35:-0.981892 36:-0.863933
+-1024 1:-0.651425 2:-0.657149 3:-0.372011 4:0.5338 5:-0.993339 6:-0.730797 7:-0.27532 8:0.36165 9:-0.976035 10:-0.814781 11:-0.219531 12:0.566943 13:-0.963102 14:-0.862253 15:-0.238767 16:0.645113 17:-0.967097 18:-0.852701 19:0.000984102 20:-0.294962 21:0.110581 22:0.991516 23:-0.928789 24:-0.0522242 25:0.177585 26:0.102946 27:-0.750687 28:-0.554565 29:0.217832 30:-0.147603 31:-0.617478 32:-0.729271 33:0.215284 34:0.00709426 35:-0.655446 36:-0.703838
+1024 1:-0.656807 2:-0.738808 3:-0.449803 4:0.319792 5:-0.992376 6:-0.824342 7:-0.394213 8:0.327184 9:-0.986981 10:-0.860067 11:-0.326235 12:0.531857 13:-0.977824 14:-0.900657 15:-0.353181 16:0.610099 17:-0.980775 18:-0.891341 19:-0.0277378 20:-0.463969 21:-0.0588298 22:0.752778 23:-0.948025 24:-0.39001 25:-0.0250359 26:0.342309 27:-0.879392 28:-0.626133 29:0.0292971 30:0.124991 31:-0.785661 32:-0.795944 33:0.0114625 34:0.303481 35:-0.813544 36:-0.761939
+-1024 1:-0.635902 2:-0.705478 3:-0.333823 4:0.34399 5:-0.99034 6:-0.814561 7:-0.296551 8:0.373179 9:-0.985595 10:-0.844099 11:-0.211909 12:0.568515 13:-0.975656 14:-0.890776 15:-0.212833 16:0.561816 17:-0.974662 18:-0.892343 19:0.134603 20:-0.396257 21:0.0988159 22:0.664207 23:-0.913955 24:-0.367206 25:0.115606 26:0.308253 27:-0.83563 28:-0.556647 29:0.250953 30:0.076208 31:-0.749132 32:-0.772645 33:0.271335 34:0.0231692 35:-0.740902 36:-0.782159
+-1024 1:-0.695308 2:-0.836642 3:-0.504964 4:0.106934 5:-0.994443 6:-0.916895 7:-0.51169 8:0.198838 9:-0.99276 10:-0.918985 11:-0.475622 12:0.435838 13:-0.988518 14:-0.9419 15:-0.472171 16:0.439636 17:-0.988125 18:-0.94275 19:0.04594 20:-0.589435 21:0.0282296 22:0.538998 23:-0.964217 24:-0.670672 25:0.0345568 26:0.566765 27:-0.958615 28:-0.693098 29:0.080253 30:0.463801 31:-0.905897 32:-0.829322 33:0.0675141 34:0.411991 35:-0.897308 36:-0.832757
+-115.5397470941682 1:-0.659912 2:-0.897343 3:-0.439902 4:0.0670709 5:-0.999551 6:-0.960265 7:-0.384305 8:0.142877 9:-0.999055 10:-0.966843 11:-0.298798 12:0.468673 13:-0.998955 14:-0.973325 15:-0.318095 16:0.492362 17:-0.9991 18:-0.972447 19:0.654091 20:-0.670381 21:0.397639 22:0.686203 23:-0.993982 24:-0.758265 25:0.554221 26:0.532563 27:-0.982142 28:-0.828174 29:0.78853 30:0.665291 31:-0.970388 32:-0.892172 33:0.745218 34:0.73515 35:-0.974042 36:-0.882849
+-1024 1:-0.598026 2:-0.751256 3:-0.359281 4:0.210748 5:-0.989416 6:-0.865614 7:-0.334767 8:0.349532 9:-0.990089 10:-0.869149 11:-0.25764 12:0.450226 13:-0.977065 14:-0.918115 15:-0.258599 16:0.566738 17:-0.982453 18:-0.908986 19:0.0883981 20:-0.48755 21:-0.0917711 22:0.460323 23:-0.904006 24:-0.515962 25:0.0154888 26:0.453 27:-0.900622 28:-0.594966 29:0.126111 30:0.0759905 31:-0.798853 32:-0.824962 33:0.118467 34:0.217423 35:-0.820192 36:-0.80084
+1024 1:-0.517307 2:-0.61197 3:-0.175411 4:0.531228 5:-0.990758 6:-0.741146 7:-0.211628 8:0.759096 9:-0.992477 10:-0.721834 11:-0.0381337 12:0.944574 13:-0.979332 14:-0.811876 15:-0.0374 16:0.852905 17:-0.974232 18:-0.823094 19:0.676569 20:-0.175499 21:0.527049 22:0.647105 23:-0.838182 24:-0.176013 25:0.466021 26:0.96087 27:-0.871604 28:-0.0467322 29:0.681525 30:0.39206 31:-0.680037 32:-0.563005 33:0.686618 34:0.208805 35:-0.646852 36:-0.605477
+-1024 1:-0.64563 2:-0.789509 3:-0.363524 4:0.21644 5:-0.995732 6:-0.894483 7:-0.380059 8:0.369855 9:-0.996003 10:-0.88959 11:-0.278981 12:0.581609 13:-0.990267 14:-0.925057 15:-0.305891 16:0.601871 17:-0.989968 18:-0.921372 19:0.250738 20:-0.486234 21:0.235285 22:0.480929 23:-0.927086 24:-0.596904 25:0.21334 26:0.730902 27:-0.948763 28:-0.543502 29:0.329932 30:0.314551 31:-0.842648 32:-0.795621 33:0.273883 34:0.452968 35:-0.859083 36:-0.768717
+1024 1:-0.668606 2:-0.597687 3:-0.400299 4:0.300752 5:-0.962716 6:-0.717837 7:-0.389967 8:0.41124 9:-0.960136 10:-0.728981 11:-0.339955 12:0.373148 13:-0.920081 14:-0.831174 15:-0.344028 16:0.323975 17:-0.914464 18:-0.840424 19:-0.0920572 20:-0.321977 21:-0.0611827 22:0.175638 23:-0.764395 24:-0.451515 25:-0.0917824 26:0.165955 27:-0.732898 28:-0.464597 29:-0.0165632 30:0.0208675 31:-0.632444 32:-0.680041 33:-0.0140154 34:-0.123593 35:-0.600219 36:-0.710945
+1006.680187601833 1:-0.672538 2:-0.868313 3:-0.442731 4:0.122137 5:-0.999497 6:-0.946015 7:-0.479136 8:0.256085 9:-0.999489 10:-0.942311 11:-0.373491 12:0.579932 13:-0.999138 14:-0.957935 15:-0.368436 16:0.587919 17:-0.999135 18:-0.958522 19:0.591653 20:-0.587287 21:0.517637 22:0.657278 23:-0.981717 24:-0.701759 25:0.373055 26:0.852 27:-0.988416 28:-0.668775 29:0.714644 30:0.858531 31:-0.963273 32:-0.825255 33:0.724835 34:0.832825 35:-0.961189 36:-0.827586
+1024 1:-0.59161 2:-0.670393 3:-0.383326 4:0.275082 5:-0.977212 6:-0.785453 7:-0.30929 8:0.30663 9:-0.971732 10:-0.82094 11:-0.253067 12:0.390143 13:-0.951451 14:-0.882182 15:-0.255548 16:0.384048 17:-0.950641 18:-0.885334 19:-0.0539651 20:-0.465191 21:-0.202358 22:0.504251 23:-0.899789 24:-0.444317 25:-0.194284 26:0.275905 27:-0.845997 28:-0.596177 29:-0.0343978 30:0.0449531 31:-0.771266 32:-0.809925 33:-0.0598758 34:-0.0126649 35:-0.758205 36:-0.816896
+1024 1:-0.682267 2:-0.640015 3:-0.380497 4:0.502658 5:-0.991011 6:-0.730524 7:-0.404121 8:0.655238 9:-0.990709 10:-0.732825 11:-0.292701 12:0.828084 13:-0.976474 14:-0.821115 15:-0.290636 16:0.8213 17:-0.975001 18:-0.820706 19:0.0584279 20:-0.241637 21:0.164699 22:0.668058 23:-0.844539 24:-0.131494 25:0.139444 26:0.689873 27:-0.847504 28:-0.224665 29:0.230571 30:0.155699 31:-0.649357 32:-0.633809 33:0.217832 34:0.210595 35:-0.653124 36:-0.612913
+-1024 1:-0.832133 2:-0.833537 3:-0.635087 4:0.0539656 5:-0.990744 6:-0.911036 7:-0.636245 8:0.154095 9:-0.990603 10:-0.916563 11:-0.600619 12:0.343971 13:-0.98343 14:-0.943991 15:-0.603366 16:0.342686 17:-0.982917 18:-0.945978 19:-0.566022 20:-0.654055 21:-0.475298 22:0.324009 23:-0.941573 24:-0.714654 25:-0.501792 26:0.324272 27:-0.933865 28:-0.750843 29:-0.444588 30:0.136221 31:-0.862045 32:-0.877856 33:-0.444588 34:0.10481 35:-0.86054 36:-0.885276
+415.0559181618593 1:-0.680611 2:-0.472496 3:-0.345138 4:0.539462 5:-0.953531 6:-0.535859 7:-0.363074 8:0.719993 9:-0.957196 10:-0.541207 11:-0.294225 12:0.588387 13:-0.898938 14:-0.724183 15:-0.296738 16:0.528922 17:-0.888989 18:-0.731585 19:0.050311 20:-0.0694314 21:0.131757 22:0.396315 23:-0.640436 24:0.0809264 25:0.115606 26:0.412831 27:-0.664213 28:-0.0601461 29:0.205093 30:-0.187187 31:-0.388211 32:-0.534179 33:0.192354 34:-0.183024 35:-0.374216 36:-0.519483
+-1024 1:-0.732775 2:-0.971605 3:-0.557297 4:-0.175884 5:-0.997995 6:-0.993424 7:-0.544245 8:-0.0941741 9:-0.997879 10:-0.994239 11:-0.408551 12:0.183133 13:-0.998356 14:-0.995634 15:-0.380641 16:0.192276 17:-0.998383 18:-0.996249 19:-0.137643 20:-0.893231 21:-0.221181 22:0.291617 23:-0.99839 24:-0.95529 25:-0.227657 26:0.290329 27:-0.99527 28:-0.960535 29:-0.118473 30:0.463157 31:-0.994504 32:-0.972822 33:-0.03185 34:0.430912 35:-0.994433 36:-0.976897
+-1024 1:-0.60527 2:-0.578292 3:-0.298463 4:0.486694 5:-0.975162 6:-0.669275 7:-0.247012 8:0.391215 9:-0.957788 10:-0.745391 11:-0.179899 12:0.447072 13:-0.930707 14:-0.832401 15:-0.182324 16:0.447274 17:-0.929785 18:-0.834 19:0.12711 20:-0.273552 21:0.0776393 22:0.614366 23:-0.840202 24:-0.17007 25:0.103687 26:0.111458 27:-0.703624 28:-0.460592 29:0.230571 30:-0.164229 31:-0.59647 32:-0.716307 33:0.245857 34:-0.226847 35:-0.583501 36:-0.728457
+1024 1:-0.688063 2:-0.221467 3:-0.360695 4:0.285852 5:-0.771522 6:-0.243172 7:-0.374397 8:0.486922 9:-0.799531 10:-0.256146 11:-0.332333 12:-0.0496303 13:-0.623908 14:-0.595724 15:-0.325721 16:-0.160646 17:-0.594575 18:-0.620478 19:0.0453155 20:0.0965427 21:0.143522 22:0.054919 23:-0.336381 24:0.23778 25:0.115606 26:0.0370141 27:-0.384859 28:0.034952 29:0.215284 30:-0.308308 31:-0.155254 32:-0.382356 33:0.20764 34:-0.352198 35:-0.127993 36:-0.384161
+1024 1:-0.652253 2:-0.774043 3:-0.422929 4:0.277351 5:-0.995602 6:-0.865772 7:-0.421106 8:0.384763 9:-0.994925 10:-0.87482 11:-0.309469 12:0.655221 13:-0.991041 14:-0.911699 15:-0.310468 16:0.672095 17:-0.991344 18:-0.91113 19:0.115871 20:-0.498718 21:0.0729334 22:0.607214 23:-0.938407 24:-0.513458 25:0.0583949 26:0.549745 27:-0.92427 28:-0.590015 29:0.248405 30:0.307172 31:-0.848503 32:-0.804991 33:0.230571 34:0.325949 35:-0.847643 36:-0.797122
+1024 1:-0.640249 2:-0.427315 3:-0.386155 4:0.80583 5:-0.962537 6:-0.348894 7:-0.283812 8:-0.10024 9:-0.829515 10:-0.718247 11:-0.268311 12:0.0110782 13:-0.796616 14:-0.7756 15:-0.273855 16:-0.0470267 17:-0.786408 18:-0.788985 19:0.021589 20:-0.184629 21:0.0658745 22:0.873897 23:-0.846135 24:0.153641 25:0.110839 26:-0.519167 27:-0.461532 28:-0.636966 29:0.197449 30:-0.461715 31:-0.415621 32:-0.686185 33:0.179615 34:-0.541743 35:-0.395215 36:-0.704261
+1024 1:-0.592852 2:-0.639172 3:-0.289977 4:0.461329 5:-0.98786 6:-0.747432 7:-0.259751 8:0.511463 9:-0.983222 10:-0.780178 11:-0.155509 12:0.625462 13:-0.964994 14:-0.853648 15:-0.164018 16:0.686842 17:-0.967881 18:-0.846375 19:0.267596 20:-0.316303 21:0.214111 22:0.61581 23:-0.863452 24:-0.259721 25:0.199039 26:0.361115 27:-0.804758 28:-0.442994 29:0.396175 30:-0.0351739 31:-0.671301 32:-0.733218 33:0.340123 34:0.0925559 35:-0.687541 36:-0.70079
+1024 1:-0.565945 2:-0.362538 3:-0.219258 4:0.360203 5:-0.894861 6:-0.502325 7:-0.231444 8:0.548407 9:-0.893762 10:-0.473494 11:-0.172277 12:0.193325 13:-0.794503 14:-0.709252 15:-0.173171 16:0.192823 17:-0.794077 18:-0.714382 19:0.130856 20:-0.0799948 21:0.185875 22:0.0807553 23:-0.600703 24:-0.230366 25:0.0655463 26:0.363756 27:-0.626365 28:-0.0313856 29:0.253501 30:-0.179341 31:-0.419619 32:-0.558007 33:0.245857 34:-0.12814 35:-0.432813 36:-0.545589
+1024 1:-0.702967 2:-0.540249 3:-0.45829 4:0.106934 5:-0.907711 6:-0.679884 7:-0.476305 8:0.483686 9:-0.940819 10:-0.612229 11:-0.451232 12:0.059214 13:-0.836506 14:-0.809995 15:-0.438611 16:0.300359 17:-0.870974 18:-0.779817 19:-0.0939305 20:-0.310587 21:-0.136477 22:0.0835834 23:-0.686722 24:-0.391074 25:-0.120388 26:0.358884 27:-0.762183 28:-0.330034 29:-0.0394934 30:-0.175757 31:-0.552791 32:-0.691724 33:-0.0140154 34:-0.12427 35:-0.572135 36:-0.685311
+-1024 1:-0.665087 2:-0.827262 3:-0.472432 4:0.107962 5:-0.993409 6:-0.913835 7:-0.456491 8:0.192298 9:-0.991618 10:-0.919336 11:-0.420746 12:0.423074 13:-0.986052 14:-0.939696 15:-0.418779 16:0.411287 17:-0.985652 18:-0.943399 19:-0.108293 20:-0.632834 21:-0.152946 22:0.472104 23:-0.961527 24:-0.695535 25:-0.156145 26:0.468659 27:-0.951428 28:-0.723348 29:-0.143951 30:0.4607 31:-0.906754 32:-0.827692 33:-0.154142 34:0.334365 35:-0.892368 36:-0.848431
+1024 1:-0.70069 2:-0.813611 3:-0.489405 4:0.166038 5:-0.995097 6:-0.900405 7:-0.510275 8:0.31011 9:-0.995652 10:-0.897465 11:-0.390259 12:0.56537 13:-0.992523 14:-0.931879 15:-0.376064 16:0.524359 17:-0.990839 18:-0.936979 19:0.0259594 20:-0.528684 21:-0.0776534 22:0.380449 23:-0.906463 24:-0.597958 25:-0.132307 26:0.487747 27:-0.913114 28:-0.59143 29:0.0649663 30:0.170174 31:-0.827386 32:-0.825262 33:0.0522273 34:0.0428074 35:-0.807755 36:-0.844481
+-1024 1:-0.669019 2:-0.731139 3:-0.438488 4:0.241404 5:-0.986956 6:-0.835195 7:-0.443752 8:0.330772 9:-0.983809 10:-0.842211 11:-0.40093 12:0.475363 13:-0.969628 14:-0.892765 15:-0.398947 16:0.450795 17:-0.966978 18:-0.896218 19:0.00847664 20:-0.464691 21:-0.0235354 22:0.462437 23:-0.908102 24:-0.533315 25:-0.0274197 26:0.412533 27:-0.884809 28:-0.582997 29:0.00127135 30:0.265437 31:-0.799071 32:-0.759513 33:-0.0140154 34:0.210466 35:-0.784639 36:-0.763299
+1024 1:-0.661775 2:-0.829052 3:-0.439902 4:0.17766 5:-0.998065 6:-0.91807 7:-0.432429 8:0.315018 9:-0.998 10:-0.916601 11:-0.349101 12:0.609917 13:-0.996301 14:-0.939291 15:-0.359283 16:0.624872 17:-0.996421 18:-0.938646 19:0.205783 20:-0.523937 21:0.192934 22:0.667056 23:-0.967375 24:-0.597436 25:0.230026 26:0.799497 27:-0.970965 28:-0.590902 29:0.327384 30:0.626375 31:-0.911374 32:-0.789128 33:0.324836 34:0.650898 35:-0.91456 36:-0.785713
+-991.3754921628238 1:-0.609824 2:-0.758344 3:-0.34231 4:0.368935 5:-0.998961 6:-0.864409 7:-0.339013 8:0.502107 9:-0.998662 10:-0.868738 11:-0.221055 12:0.824009 13:-0.99621 14:-0.901506 15:-0.223512 16:0.829281 17:-0.996337 18:-0.902398 19:0.686559 20:-0.374732 21:0.576461 22:0.907397 23:-0.956934 24:-0.366659 25:0.580441 26:0.921937 27:-0.950403 28:-0.421048 29:0.747766 30:0.765008 31:-0.873215 32:-0.683282 33:0.770696 34:0.737512 35:-0.872289 36:-0.692631
+-1024 1:-0.611479 2:-0.659489 3:-0.347967 4:0.371295 5:-0.986164 6:-0.778518 7:-0.350337 8:0.525565 9:-0.986344 10:-0.778967 11:-0.280506 12:0.571304 13:-0.961848 14:-0.856129 15:-0.273855 16:0.565072 17:-0.961425 18:-0.859887 19:0.237002 20:-0.356965 21:0.119992 22:0.455151 23:-0.867447 24:-0.431703 25:0.144212 26:0.490781 27:-0.862567 28:-0.466719 29:0.192354 30:0.166348 31:-0.720278 32:-0.708038 33:0.179615 34:0.145037 35:-0.715381 36:-0.711035
+1024 1:-0.658877 2:-0.594412 3:-0.421515 4:0.259599 5:-0.950706 6:-0.701185 7:-0.398459 8:0.544602 9:-0.969886 10:-0.691416 11:-0.315566 12:0.454979 13:-0.928874 14:-0.822816 15:-0.308942 16:0.323274 17:-0.91633 18:-0.844201 19:0.0727886 20:-0.268856 21:-0.00706475 22:0.327812 23:-0.751653 24:-0.259665 25:-0.0584092 26:0.298071 27:-0.717409 28:-0.299381 29:0.128659 30:-0.230743 31:-0.540478 32:-0.696941 33:0.133754 34:-0.442125 35:-0.497196 36:-0.745288
+-1024 1:-0.715594 2:-0.905182 3:-0.557297 4:0.0211812 5:-0.998043 6:-0.957351 7:-0.538583 8:0.118695 9:-0.997624 10:-0.959769 11:-0.496963 12:0.443588 13:-0.997683 14:-0.967464 15:-0.481324 16:0.441106 17:-0.99755 18:-0.969526 19:-0.283766 20:-0.739269 21:-0.225887 22:0.504608 23:-0.989312 24:-0.793899 25:-0.234808 26:0.539581 27:-0.985677 28:-0.804892 29:-0.17962 30:0.6469 31:-0.969525 32:-0.876865 33:-0.184715 34:0.561059 35:-0.964657 36:-0.890215
+1024 1:-0.776037 2:-1 3:-0.701563 4:-0.196573 5:-0.999261 6:-0.998043 7:-0.670214 8:-0.123644 9:-0.999184 10:-0.999178 11:-0.500012 12:0.130888 13:-0.999333 14:-0.999546 15:-0.513361 16:0.150137 17:-0.999321 18:-0.999587 19:-0.35433 20:-0.959593 21:-0.508239 22:0.183456 23:-0.999452 24:-0.986241 25:-0.501792 26:0.238085 27:-0.999414 28:-0.988309 29:-0.299366 30:0.395363 31:-0.999471 32:-0.992407 33:-0.304459 34:0.391281 35:-0.999469 36:-0.992302
+-1024 1:-0.724702 2:-0.980642 3:-0.602556 4:-0.195417 5:-0.998217 6:-0.995872 7:-0.558399 8:-0.115682 9:-0.99807 10:-0.996538 11:-0.442086 12:0.143035 13:-0.9983 14:-0.99756 15:-0.434034 16:0.160091 17:-0.998348 18:-0.997766 19:-0.322483 20:-0.933699 21:-0.423533 22:0.22232 23:-0.998915 24:-0.974756 25:-0.404056 26:0.284479 27:-0.998973 28:-0.976988 29:-0.253506 30:0.443002 31:-0.998579 32:-0.985234 33:-0.24841 34:0.435555 35:-0.998569 36:-0.985554
+-1024 1:-0.665914 2:-0.596565 3:-0.430001 4:0.601996 5:-0.9859 6:-0.625032 7:-0.302213 8:0.18957 9:-0.944393 10:-0.797306 11:-0.283554 12:0.379135 13:-0.926365 14:-0.841348 15:-0.286059 16:0.334971 17:-0.921715 18:-0.849772 19:-0.026489 20:-0.300766 21:0.0258766 22:0.918273 23:-0.91328 24:-0.0513558 25:0.0846167 26:-0.248772 27:-0.644178 28:-0.662768 29:0.151589 30:-0.276062 31:-0.577853 32:-0.746305 33:0.131206 34:-0.373988 35:-0.555652 36:-0.765332
+1024 1:-0.734224 2:-0.90472 3:-0.544567 4:0.035898 5:-0.998992 6:-0.9598 7:-0.555568 8:0.144739 9:-0.998914 10:-0.960539 11:-0.503061 12:0.470949 13:-0.998973 14:-0.969259 15:-0.508784 16:0.489422 17:-0.999045 18:-0.968707 19:-0.0314844 20:-0.678735 21:0.00705298 22:0.673282 23:-0.995139 24:-0.747598 25:-0.0178845 26:0.713905 27:-0.99313 28:-0.762733 29:0.166876 30:0.857508 31:-0.98206 32:-0.860491 33:0.169423 34:0.90663 35:-0.984678 36:-0.854201
+-368.9461123709166 1:-0.485847 2:-0.246486 3:-0.0905482 4:0.652361 5:-0.891073 6:-0.280219 7:-0.0870737 8:0.741846 9:-0.883641 10:-0.329192 11:-0.0198412 12:0.318365 13:-0.756281 14:-0.609323 15:-0.0251957 16:0.299154 17:-0.752341 18:-0.616769 19:0.586033 20:0.154255 21:0.494108 22:0.374873 23:-0.492142 24:0.367788 25:0.487475 26:0.25879 27:-0.447731 28:0.195272 29:0.610187 30:-0.298891 31:-0.146916 32:-0.349917 33:0.635665 34:-0.323982 35:-0.152388 36:-0.36903
+1024 1:-0.517514 2:-0.580929 3:-0.141465 4:0.505625 5:-0.983283 6:-0.716286 7:-0.18332 8:0.737757 9:-0.986532 10:-0.697729 11:-0.0442312 12:0.762544 13:-0.960733 14:-0.810182 15:-0.0389255 16:0.779409 17:-0.961782 18:-0.810138 19:0.585408 20:-0.156505 21:0.449401 22:0.475411 23:-0.76897 24:-0.166125 25:0.423112 26:0.764295 27:-0.81598 28:-0.077276 29:0.574518 30:0.0684104 31:-0.587975 32:-0.603131 33:0.5949 34:0.0303924 35:-0.579413 36:-0.61076
+-1024 1:-0.637144 2:-0.74512 3:-0.340895 4:0.272036 5:-0.99305 6:-0.859786 7:-0.354583 8:0.430277 9:-0.993068 10:-0.853091 11:-0.260689 12:0.590828 13:-0.983293 14:-0.903672 15:-0.292161 16:0.60986 17:-0.982479 18:-0.89841 19:0.223889 20:-0.42517 21:0.214111 22:0.430545 23:-0.893594 24:-0.536287 25:0.196656 26:0.702988 27:-0.922476 28:-0.465484 29:0.317193 30:0.181065 31:-0.78115 32:-0.769382 33:0.256048 34:0.34054 35:-0.801943 36:-0.734965
+1024 1:-0.5082 2:-0.526781 3:-0.222087 4:0.179758 5:-0.927033 6:-0.702497 7:-0.221536 8:0.418565 9:-0.941652 10:-0.686421 11:-0.138741 12:0.0900852 13:-0.870263 14:-0.841337 15:-0.147237 16:0.351068 17:-0.896571 18:-0.808223 19:0.166446 20:-0.316194 21:-0.0541238 22:0.254052 23:-0.751094 24:-0.339327 25:0.0297892 26:0.318906 27:-0.77871 28:-0.416426 29:0.210188 30:-0.181041 31:-0.628966 32:-0.752547 33:0.174519 34:-0.0126326 35:-0.65606 36:-0.713188
+-901.9112991469002 1:-0.487917 2:-0.575507 3:-0.144294 4:0.486463 5:-0.977838 6:-0.700604 7:-0.132367 8:0.569326 9:-0.971525 10:-0.716548 11:-0.117399 12:0.620327 13:-0.937055 14:-0.796115 15:-0.0785894 16:0.584552 17:-0.938566 18:-0.813843 19:0.650969 20:-0.189925 21:0.444696 22:0.534636 23:-0.800438 24:-0.184025 25:0.423112 26:0.486758 27:-0.771435 28:-0.25227 29:0.454774 30:0.295935 31:-0.608254 32:-0.519356 33:0.518466 34:-0.0744816 35:-0.54049 36:-0.618559
+-1024 1:-0.648321 2:-0.806582 3:-0.398884 4:0.18444 5:-0.996187 6:-0.906144 7:-0.422521 8:0.327577 9:-0.996078 10:-0.900864 11:-0.309469 12:0.576535 13:-0.992604 14:-0.932638 15:-0.318095 16:0.5912 17:-0.992726 18:-0.932065 19:0.065296 20:-0.508256 21:0.141169 22:0.540537 23:-0.94722 24:-0.611596 25:0.0250216 26:0.722744 27:-0.952399 28:-0.546107 29:0.217832 30:0.402048 31:-0.873507 32:-0.805693 33:0.215284 34:0.423172 35:-0.877978 36:-0.804247
+1024 1:-0.654323 2:-0.858907 3:-0.459704 4:0.0891521 5:-0.996922 6:-0.937114 7:-0.422521 8:0.202115 9:-0.996889 10:-0.940873 11:-0.339955 12:0.483192 13:-0.995547 14:-0.957406 15:-0.347079 16:0.507826 17:-0.995791 18:-0.956243 19:0.0309549 20:-0.630329 21:-0.0941241 22:0.529715 23:-0.969141 24:-0.687789 25:0.0393244 26:0.497956 27:-0.961133 28:-0.748517 29:0.100633 30:0.384584 31:-0.919798 32:-0.872644 33:0.108276 34:0.448381 35:-0.92583 36:-0.862918
+1024 1:-0.715801 2:-0.915741 3:-0.551639 4:-0.0259186 5:-0.997827 6:-0.967679 7:-0.531506 8:0.025005 9:-0.996078 10:-0.972681 11:-0.493915 12:0.330512 13:-0.996146 14:-0.977218 15:-0.461492 16:0.31385 17:-0.995875 18:-0.980567 19:-0.164494 20:-0.763431 21:-0.230593 22:0.454219 23:-0.991613 24:-0.838669 25:-0.179981 26:0.322856 27:-0.975429 28:-0.883219 29:-0.141403 30:0.493832 31:-0.968781 32:-0.916477 33:-0.115925 34:0.268421 35:-0.952361 36:-0.940794
+1024 1:-0.736915 2:-0.999254 3:-0.656304 4:-0.20235 5:-0.999217 6:-0.998551 7:-0.595198 8:-0.120963 9:-0.999243 10:-0.999217 11:-0.431417 12:0.130366 13:-0.999299 14:-0.999531 15:-0.432509 16:0.147027 17:-0.999283 18:-0.999655 19:-0.436759 20:-0.972021 21:-0.600002 22:0.17045 23:-0.999786 24:-0.990026 25:-0.539931 26:0.224431 27:-0.99979 28:-0.992673 29:-0.357963 30:0.378253 31:-0.999801 32:-0.99466 33:-0.317198 34:0.37128 35:-0.999823 36:-0.994994
+1024 1:-0.653495 2:-0.986923 3:-0.448389 4:-0.21714 5:-0.998074 6:-0.997752 7:-0.452245 8:-0.134781 9:-0.99846 10:-0.998737 11:-0.195142 12:0.117281 13:-0.998632 14:-0.99908 15:-0.188426 16:0.133946 17:-0.998665 18:-0.999275 19:-0.317487 20:-0.961907 21:-0.472945 22:0.176767 23:-0.999447 24:-0.988028 25:-0.466035 26:0.219059 27:-0.999119 28:-0.990576 29:-0.210193 30:0.366218 31:-0.999147 32:-0.993931 33:-0.20255 34:0.360372 35:-0.999138 36:-0.99399
+1024 1:-0.789906 2:-0.946829 3:-0.613871 4:-0.105158 5:-0.997782 6:-0.981994 7:-0.606521 8:-0.013672 9:-0.997896 10:-0.983865 11:-0.547268 12:0.281204 13:-0.998163 14:-0.987351 15:-0.565227 16:0.305101 17:-0.998316 18:-0.986897 19:-0.20446 20:-0.807305 21:-0.18824 22:0.431216 23:-0.995531 24:-0.879263 25:-0.189516 26:0.45182 27:-0.993521 28:-0.894576 29:-0.0496846 30:0.605584 31:-0.988746 32:-0.935099 33:-0.0904493 34:0.650455 35:-0.990425 36:-0.928845
+-1024 1:-0.67378 2:-0.728387 3:-0.459704 4:0.379784 5:-0.993811 6:-0.802558 7:-0.337598 8:0.255855 9:-0.982588 10:-0.873185 11:-0.300323 12:0.520579 13:-0.975947 14:-0.89961 15:-0.299789 16:0.497522 17:-0.974508 18:-0.904125 19:-0.0470964 20:-0.431521 21:-5.88239e-06 22:0.884362 23:-0.957299 24:-0.305083 25:0.0583949 26:0.0578488 27:-0.812262 28:-0.699797 29:0.141398 30:0.016872 31:-0.752889 32:-0.798512 33:0.118467 34:-0.0879768 35:-0.733405 36:-0.815113
+1024 1:-0.730912 2:-0.982912 3:-0.588412 4:-0.165321 5:-0.999149 6:-0.995142 7:-0.60369 8:-0.0727267 9:-0.999057 10:-0.994873 11:-0.51678 12:0.198269 13:-0.999118 14:-0.996114 15:-0.514886 16:0.215046 17:-0.999138 18:-0.996261 19:-0.193219 20:-0.892544 21:-0.256476 22:0.294429 23:-0.998271 24:-0.952695 25:-0.246727 26:0.341898 27:-0.997325 28:-0.954787 29:-0.149046 30:0.505955 31:-0.995926 32:-0.970284 33:-0.149046 34:0.499226 35:-0.9958 36:-0.970333
+1011.674312709385 1:-0.770448 2:-0.978241 3:-0.632259 4:-0.168909 5:-0.998652 6:-0.993911 7:-0.610767 8:-0.0857793 9:-0.998625 10:-0.994937 11:-0.493915 12:0.185949 13:-0.998879 14:-0.996343 15:-0.492003 16:0.202256 17:-0.998891 18:-0.996505 19:-0.171364 20:-0.89108 21:-0.157652 22:0.301757 23:-0.9987 24:-0.955817 25:-0.175213 26:0.36112 27:-0.998621 28:-0.958737 29:0.00636694 30:0.549736 31:-0.998155 32:-0.973149 33:0.0140103 34:0.544162 35:-0.998173 36:-0.973479
+-1024 1:-0.620793 2:-0.699857 3:-0.339481 4:0.321732 5:-0.989672 6:-0.820812 7:-0.351752 8:0.481601 9:-0.989113 10:-0.811978 11:-0.240872 12:0.606842 13:-0.97563 14:-0.880819 15:-0.247921 16:0.623744 17:-0.976223 18:-0.880333 19:0.0665449 20:-0.363575 21:0.138816 22:0.452728 23:-0.876766 24:-0.462288 25:0.0226402 26:0.68426 27:-0.889509 28:-0.350183 29:0.192354 30:0.145766 31:-0.741567 32:-0.739527 33:0.202545 34:0.180057 35:-0.750956 36:-0.736253
+-1024 1:-0.729877 2:-0.938188 3:-0.592655 4:-0.025049 5:-0.999711 6:-0.976601 7:-0.58529 8:0.0810064 9:-0.99968 10:-0.977209 11:-0.536597 12:0.392185 13:-0.999633 14:-0.98175 15:-0.537769 16:0.405989 17:-0.999669 18:-0.98199 19:0.132105 20:-0.754164 21:0.037639 22:0.543371 23:-0.996826 24:-0.844863 25:0.0607787 26:0.596561 27:-0.995581 28:-0.852581 29:0.210188 30:0.76608 31:-0.988967 32:-0.906232 33:0.215284 34:0.757651 35:-0.988914 36:-0.90748
+1024 1:-0.707728 2:-0.94503 3:-0.538909 4:-0.0908063 5:-0.998478 6:-0.983199 7:-0.534337 8:0.0137194 9:-0.998505 10:-0.982979 11:-0.437514 12:0.306114 13:-0.998751 14:-0.987734 15:-0.440137 16:0.322548 17:-0.99874 18:-0.987637 19:-0.0689521 20:-0.802445 21:-0.136477 22:0.428239 23:-0.995388 24:-0.882526 25:-0.0536416 26:0.469109 27:-0.994168 28:-0.89619 29:0.0522273 30:0.603473 31:-0.988768 32:-0.937304 33:0.0624185 34:0.607301 35:-0.988852 36:-0.936308
+1024 1:-0.695515 2:-0.900348 3:-0.482333 4:0.031854 5:-0.998697 6:-0.961107 7:-0.452245 8:0.122242 9:-0.998502 10:-0.965983 11:-0.373491 12:0.439782 13:-0.998405 14:-0.973227 15:-0.400473 16:0.458322 17:-0.998418 18:-0.972206 19:0.00660397 20:-0.672432 21:0.0588156 22:0.659579 23:-0.991915 24:-0.733955 25:0.13706 26:0.536771 27:-0.980731 28:-0.805859 29:0.253501 30:0.608202 31:-0.961922 32:-0.882276 33:0.215284 34:0.672785 35:-0.966461 36:-0.873178
+1024 1:-0.495575 2:-0.416556 3:-0.127323 4:0.711466 5:-0.962756 6:-0.491142 7:-0.0785813 8:0.734582 9:-0.950228 10:-0.552018 11:-0.00764625 12:0.652562 13:-0.895125 14:-0.707566 15:-0.0175681 16:0.634398 17:-0.890481 18:-0.707857 19:0.440551 20:0.00708395 21:0.388227 22:0.605546 23:-0.705804 24:0.170974 25:0.44695 26:0.326227 27:-0.625908 28:-0.0918007 29:0.50318 30:-0.0961451 31:-0.387068 32:-0.476661 33:0.50318 34:-0.0785125 35:-0.389629 36:-0.470574
+1024 1:-0.690133 2:-0.834792 3:-0.422929 4:0.144881 5:-0.997282 6:-0.924244 7:-0.411198 8:0.239127 9:-0.996656 10:-0.930592 11:-0.329284 12:0.53144 13:-0.994938 14:-0.947978 15:-0.324196 16:0.536124 17:-0.994816 18:-0.948832 19:0.151461 20:-0.571412 21:0.155287 22:0.641742 23:-0.972019 24:-0.637126 25:0.153747 26:0.503856 27:-0.948817 28:-0.708918 29:0.314645 30:0.449599 31:-0.91071 32:-0.843423 33:0.335028 34:0.410338 35:-0.906902 36:-0.848996
+1024 1:-0.624312 2:-0.780169 3:-0.35221 4:0.299396 5:-0.998068 6:-0.881493 7:-0.331936 8:0.395114 9:-0.997184 10:-0.892441 11:-0.218006 12:0.704512 13:-0.994834 14:-0.919598 15:-0.238767 16:0.734766 17:-0.995303 18:-0.916597 19:0.396844 20:-0.488389 21:0.345874 22:0.741908 23:-0.961559 24:-0.520068 25:0.318227 26:0.619289 27:-0.940091 28:-0.606338 29:0.523562 30:0.490424 31:-0.882635 32:-0.792605 33:0.470061 34:0.576876 35:-0.889515 36:-0.772993
+-1024 1:-0.49247 2:-0.599401 3:-0.149952 4:0.46648 5:-0.981042 6:-0.725593 7:-0.139442 8:0.56345 9:-0.976532 10:-0.738701 11:-0.114351 12:0.65145 13:-0.948107 14:-0.810239 15:-0.0755384 16:0.605092 17:-0.948339 18:-0.827869 19:0.669076 20:-0.215039 21:0.461166 22:0.575087 23:-0.825358 24:-0.211582 25:0.439799 26:0.535782 27:-0.799504 28:-0.275123 29:0.467513 30:0.335769 31:-0.642014 32:-0.540769 33:0.531205 34:-0.0308761 35:-0.577239 36:-0.635956
+-1024 1:-0.661982 2:-0.723477 3:-0.390398 4:0.415147 5:-0.996041 6:-0.811249 7:-0.314951 8:0.343357 9:-0.986964 10:-0.859451 11:-0.24697 12:0.600542 13:-0.980298 14:-0.894001 15:-0.269278 16:0.66285 17:-0.982592 18:-0.887204 19:-0.00338629 20:-0.373842 21:0.0964629 22:0.96185 23:-0.952531 24:-0.21089 25:0.179969 26:0.258874 27:-0.834285 28:-0.595026 29:0.233118 30:0.0382913 31:-0.726357 32:-0.763223 33:0.222927 34:0.188602 35:-0.756076 36:-0.738608
+1024 1:-0.714145 2:-0.89815 3:-0.553053 4:0.00762597 5:-0.997175 6:-0.95668 7:-0.56406 8:0.0888325 9:-0.995674 10:-0.957342 11:-0.533548 12:0.35339 13:-0.994121 14:-0.968301 15:-0.530142 16:0.364559 17:-0.994073 18:-0.968752 19:0.0303304 20:-0.692051 21:-5.88239e-06 22:0.508512 23:-0.984441 24:-0.782434 25:0.0369406 26:0.587116 27:-0.98544 28:-0.794587 29:0.108276 30:0.602942 31:-0.962169 32:-0.88151 33:0.105728 34:0.559729 35:-0.957957 36:-0.884355
+-1024 1:-0.713524 2:-0.975684 3:-0.579926 4:-0.163034 5:-0.99879 6:-0.993898 7:-0.600859 8:-0.0609943 9:-0.99878 10:-0.99287 11:-0.464952 12:0.205403 13:-0.998943 14:-0.995442 15:-0.438611 16:0.214508 17:-0.998874 18:-0.995889 19:-0.292509 20:-0.893922 21:-0.329417 22:0.248588 23:-0.996294 24:-0.954673 25:-0.35638 26:0.348478 27:-0.997618 28:-0.951714 29:-0.289175 30:0.458235 31:-0.993962 32:-0.970633 33:-0.245862 34:0.429581 35:-0.993484 36:-0.973523
+1024 1:-0.561806 2:-0.461499 3:-0.289977 4:0.548918 5:-0.949555 6:-0.524877 7:-0.213044 8:0.285589 9:-0.90861 10:-0.672058 11:-0.179899 12:0.15764 13:-0.84587 14:-0.790167 15:-0.19758 16:0.360364 17:-0.866426 18:-0.757813 19:0.160203 20:-0.223444 21:0.0752863 22:0.634158 23:-0.816195 24:-0.0701593 25:0.132293 26:-0.0443917 27:-0.653787 28:-0.512161 29:0.192354 30:-0.211965 31:-0.53222 32:-0.679189 33:0.171971 34:-0.0268856 35:-0.570003 36:-0.63396
+-1024 1:-0.57588 2:-0.764817 3:-0.256031 4:0.27729 5:-0.995982 6:-0.880634 7:-0.305044 8:0.424564 9:-0.995066 10:-0.870502 11:-0.118924 12:0.666716 13:-0.991338 14:-0.916542 15:-0.164018 16:0.726786 17:-0.99252 18:-0.909193 19:0.800821 20:-0.441575 21:0.534108 22:0.552404 23:-0.925916 24:-0.551446 25:0.404042 26:0.834812 27:-0.944896 28:-0.456672 29:0.826747 30:0.361578 31:-0.853546 32:-0.796079 33:0.71974 34:0.559568 35:-0.871713 36:-0.755379
+1024 1:-0.55539 2:-0.688563 3:-0.210771 4:0.413432 5:-0.994358 6:-0.818159 7:-0.262581 8:0.61015 9:-0.995419 10:-0.805661 11:-0.102156 12:0.810758 13:-0.986533 14:-0.868981 15:-0.0907922 16:0.811304 17:-0.986525 18:-0.870883 19:0.660959 20:-0.288294 21:0.510578 22:0.600151 23:-0.873041 24:-0.33377 25:0.456485 26:0.882909 27:-0.905605 28:-0.257963 29:0.643308 30:0.360934 31:-0.755696 32:-0.671592 33:0.666239 34:0.313623 35:-0.747418 36:-0.680155
+1024 1:-0.587057 2:-0.500021 3:-0.285733 4:0.547161 5:-0.959678 6:-0.566402 7:-0.234275 8:0.349836 9:-0.927565 10:-0.683096 11:-0.169228 12:0.319008 13:-0.886277 14:-0.797718 15:-0.173171 16:0.328486 17:-0.88593 18:-0.798375 19:0.140222 20:-0.214284 21:0.0846981 22:0.575412 23:-0.793808 24:-0.0791645 25:0.108455 26:0.00643077 27:-0.62566 28:-0.419773 29:0.240762 30:-0.257486 31:-0.518944 32:-0.684758 33:0.248405 34:-0.321023 35:-0.504011 36:-0.697547
+-1024 1:-0.472187 2:-0.233107 3:-0.100449 4:0.448388 5:-0.857395 6:-0.350356 7:-0.126705 8:0.867166 9:-0.890783 10:-0.249221 11:-0.0289875 12:0.278015 13:-0.737039 14:-0.598428 15:-0.0312979 16:0.133501 17:-0.714674 18:-0.633295 19:0.57854 20:0.15001 21:0.489402 22:0.0612835 23:-0.421762 24:0.0860615 25:0.394507 26:0.426439 27:-0.48231 28:0.347568 29:0.607639 30:-0.230928 31:-0.173663 32:-0.336908 33:0.640761 34:-0.436732 35:-0.127275 36:-0.407556
+1024 1:-0.710626 2:-0.975466 3:-0.575682 4:-0.169608 5:-0.998604 6:-0.994155 7:-0.537168 8:-0.0873025 9:-0.998611 10:-0.995318 11:-0.407027 12:0.179423 13:-0.998728 14:-0.996535 15:-0.409626 16:0.19607 17:-0.998714 18:-0.996621 19:-0.125153 20:-0.897044 21:-0.197652 22:0.294519 23:-0.998991 24:-0.959441 25:-0.21097 26:0.347321 27:-0.998496 28:-0.961068 29:-0.0242066 30:0.527632 31:-0.998008 32:-0.975169 33:-0.03185 34:0.520509 35:-0.997938 36:-0.975238
+-1024 1:-0.881398 2:-0.979604 3:-0.748239 4:-0.182956 5:-0.997942 6:-0.993099 7:-0.750891 8:-0.0937882 9:-0.998311 10:-0.994243 11:-0.672265 12:0.17242 13:-0.998428 14:-0.99587 15:-0.67354 16:0.192951 17:-0.998641 18:-0.996148 19:-0.602865 20:-0.905235 21:-0.451769 22:0.254953 23:-0.997728 24:-0.956901 25:-0.494641 26:0.305443 27:-0.997013 28:-0.958294 29:-0.393632 30:0.457695 31:-0.99574 32:-0.973915 33:-0.385989 34:0.453903 35:-0.996002 36:-0.974669
+1024 1:-0.599682 2:-0.650639 3:-0.308364 4:0.413305 5:-0.987936 6:-0.770703 7:-0.306459 8:0.601484 9:-0.989223 10:-0.764338 11:-0.199716 12:0.683033 13:-0.969576 14:-0.847896 15:-0.212833 16:0.720848 17:-0.971146 18:-0.843334 19:0.216397 20:-0.2692 21:0.225873 22:0.549666 23:-0.8548 24:-0.306212 25:0.225259 26:0.706522 27:-0.863003 28:-0.265365 29:0.28917 30:0.203072 31:-0.683762 32:-0.651106 33:0.294263 34:0.250508 35:-0.694044 36:-0.642048
+1024 1:-0.626174 2:-0.788166 3:-0.37484 4:0.246962 5:-0.9961 6:-0.884856 7:-0.351752 8:0.290667 9:-0.992384 10:-0.897923 11:-0.291176 12:0.578881 13:-0.988995 14:-0.921359 15:-0.283008 16:0.566875 17:-0.988607 18:-0.924961 19:0.338152 20:-0.47065 21:0.282344 22:0.770924 23:-0.962869 24:-0.494227 25:0.342065 26:0.657228 27:-0.944558 28:-0.595275 29:0.439487 30:0.607574 31:-0.887312 32:-0.758739 33:0.442035 34:0.456781 35:-0.869719 36:-0.784106
+-1024 1:-0.629693 2:-0.599695 3:-0.390398 4:0.37422 5:-0.966359 6:-0.688127 7:-0.368735 8:0.398018 9:-0.958657 10:-0.733445 11:-0.288128 12:0.372331 13:-0.924726 14:-0.840421 15:-0.30284 16:0.390114 17:-0.923428 18:-0.836542 19:0.132105 20:-0.319558 21:0.0235237 22:0.468759 23:-0.813386 24:-0.256593 25:0.0250216 26:0.265347 27:-0.761986 28:-0.431314 29:0.245857 30:-0.155401 31:-0.627494 32:-0.740574 33:0.220379 34:-0.120006 35:-0.625914 36:-0.725844
+-653.3157792031579 1:-0.50344 2:-0.480209 3:-0.116007 4:0.533496 5:-0.96227 6:-0.610619 7:-0.152181 8:0.78902 9:-0.968613 10:-0.584289 11:-0.0457555 12:0.616156 13:-0.915177 14:-0.756745 15:-0.0450277 16:0.653195 17:-0.918475 18:-0.753878 19:0.539829 20:-0.0583501 21:0.418813 22:0.344163 23:-0.660443 24:-0.0527262 25:0.399274 26:0.630752 27:-0.718743 28:0.0445256 29:0.543944 30:-0.118475 31:-0.443786 32:-0.543026 33:0.564327 34:-0.139555 35:-0.438493 36:-0.547682
+-1024 1:-0.721804 2:-0.972925 3:-0.56154 4:-0.161806 5:-0.998658 6:-0.993488 7:-0.593782 8:-0.0611432 9:-0.998648 10:-0.992536 11:-0.420746 12:0.207575 13:-0.998946 14:-0.995427 15:-0.432509 16:0.224812 17:-0.998885 18:-0.995326 19:-0.172612 20:-0.880073 21:-0.237652 22:0.323684 23:-0.998584 24:-0.945642 25:-0.232425 26:0.390074 27:-0.998459 28:-0.94743 29:-0.0522324 30:0.56492 31:-0.997546 32:-0.968147 33:-0.0598758 34:0.562663 35:-0.997511 36:-0.967682
+1024 1:-0.558081 2:-0.433339 3:-0.216429 4:0.453265 5:-0.936907 6:-0.560206 7:-0.218705 8:0.678946 9:-0.942404 10:-0.539072 11:-0.164655 12:0.395895 13:-0.861712 14:-0.734688 15:-0.176222 16:0.45358 17:-0.865453 18:-0.72402 19:0.197041 20:-0.0740919 21:0.207052 22:0.262095 23:-0.65806 24:-0.136353 25:0.191888 26:0.41954 27:-0.668499 28:-0.0671694 29:0.278979 30:-0.0438173 31:-0.443391 32:-0.512116 33:0.276431 34:0.00499823 35:-0.457141 36:-0.501225
+1024 1:-0.847244 2:-0.93636 3:-0.78077 4:-0.00245685 5:-0.999961 6:-0.964376 7:-0.784861 8:0.111864 9:-0.999975 10:-0.963762 11:-0.734762 12:0.418365 13:-0.999913 14:-0.974751 15:-0.736086 16:0.430468 17:-0.999922 18:-0.975045 19:-0.343714 20:-0.742197 21:-0.284711 22:0.561037 23:-0.996728 24:-0.80691 25:-0.353996 26:0.689126 27:-0.999217 28:-0.807388 29:-0.0955449 30:0.883173 31:-0.995845 32:-0.895 33:-0.0904493 34:0.872126 35:-0.995555 36:-0.89588
+-1024 1:-0.840413 2:-0.941864 3:-0.770869 4:-0.0171189 5:-0.999879 6:-0.968329 7:-0.766461 8:0.0920889 9:-0.999902 10:-0.969214 11:-0.722567 12:0.398702 13:-0.999852 14:-0.977601 15:-0.723881 16:0.41209 17:-0.999849 18:-0.977673 19:-0.318112 20:-0.755717 21:-0.317652 22:0.582251 23:-0.999171 24:-0.82093 25:-0.306322 26:0.654851 27:-0.998938 28:-0.82921 29:-0.103188 30:0.896988 31:-0.997745 32:-0.902251 33:-0.105736 34:0.895553 35:-0.997839 36:-0.902384
+-1024 1:-0.84062 2:-0.941765 3:-0.770869 4:-0.0164439 5:-0.999879 6:-0.968162 7:-0.766461 8:0.0921634 9:-0.999885 10:-0.969052 11:-0.721043 12:0.398893 13:-0.999834 14:-0.977511 15:-0.723881 16:0.411458 17:-0.999829 18:-0.977631 19:-0.323107 20:-0.755495 21:-0.320005 22:0.581606 23:-0.999137 24:-0.820458 25:-0.313474 26:0.655486 27:-0.998989 28:-0.828861 29:-0.108281 30:0.899775 31:-0.997851 32:-0.90203 33:-0.110829 34:0.895827 35:-0.997842 36:-0.902114
+-1024 1:-0.693031 2:-0.511415 3:-0.398884 4:0.530224 5:-0.973163 6:-0.613606 7:-0.40129 8:0.706487 9:-0.973418 10:-0.611484 11:-0.0899608 12:0.642674 13:-0.949792 14:-0.817308 15:-0.0892666 16:0.655784 17:-0.950537 18:-0.817699 19:0.0977639 20:-0.196726 21:0.0564627 22:0.245078 23:-0.717752 24:-0.280662 25:0.0583949 26:0.284277 27:-0.714641 28:-0.312949 29:0.136302 30:0.0331036 31:-0.562184 32:-0.59891 33:0.103181 34:0.0342056 35:-0.548907 36:-0.585246
+1024 1:-0.689512 2:-0.390961 3:-0.441317 4:0.194614 5:-0.859537 6:-0.508362 7:-0.439506 8:0.388967 9:-0.871537 10:-0.493721 11:-0.396356 12:-0.0657654 13:-0.747392 14:-0.747446 15:-0.398947 16:-0.0830144 17:-0.741661 18:-0.754501 19:0.0827788 20:-0.139687 21:0.0329355 22:0.0412365 23:-0.593865 24:-0.262888 25:0.0560111 26:0.0724915 27:-0.599231 28:-0.308536 29:0.123563 30:-0.0778755 31:-0.461729 32:-0.552576 33:0.103181 34:-0.0837606 35:-0.454426 36:-0.549127
+-1024 1:-0.845795 2:-0.774421 3:-0.806228 4:-0.0832533 5:-0.939959 6:-0.829307 7:-0.80326 8:-0.408492 9:-0.854563 10:-0.914157 11:-0.795737 12:-0.189172 13:-0.873867 14:-0.936448 15:-0.798633 16:-0.176213 17:-0.871749 18:-0.937088 19:-0.265657 20:-0.586779 21:-0.230593 22:0.458874 23:-0.942654 24:-0.628969 25:-0.294403 26:0.196764 27:-0.892996 28:-0.755944 29:-0.113377 30:0.324572 31:-0.895978 32:-0.857063 33:-0.115925 34:0.303521 35:-0.89377 36:-0.859982
+1024 1:-0.743746 2:-0.431857 3:-0.520522 4:0.530267 5:-0.913875 6:-0.348898 7:-0.47489 8:0.171548 9:-0.889726 10:-0.672713 11:-0.384161 12:0.0468151 13:-0.841835 14:-0.819831 15:-0.386743 16:0.0910031 17:-0.848071 18:-0.819189 19:-0.0233668 20:-0.114246 21:0.00470003 22:0.68868 23:-0.744836 24:0.237674 25:0.0631625 26:-0.580036 27:-0.369993 28:-0.587021 29:0.146493 30:-0.466484 31:-0.338699 32:-0.62964 33:0.131206 34:-0.551264 35:-0.309881 36:-0.646313
+1024 1:-0.8193 2:-0.898658 3:-0.776527 4:-0.000243253 5:-0.996304 6:-0.941084 7:-0.750891 8:0.0716977 9:-0.995907 10:-0.953623 11:-0.634155 12:0.282854 13:-0.993623 14:-0.975097 15:-0.638453 16:0.305238 17:-0.993875 18:-0.974711 19:-0.479222 20:-0.764375 21:-0.552943 22:0.230362 23:-0.95884 24:-0.83117 25:-0.563769 26:0.184728 27:-0.949964 28:-0.878856 29:-0.518472 30:0.247909 31:-0.93616 32:-0.926511 33:-0.523567 34:0.244873 35:-0.936228 36:-0.926539
+1024 1:-0.759891 2:-0.361136 3:-0.541738 4:0.39351 5:-0.838333 6:-0.250889 7:-0.541414 8:-0.875249 9:-0.501545 10:-0.765528 11:-0.500012 12:-0.780747 13:-0.534692 14:-0.815728 15:-0.505733 16:-0.765029 17:-0.530905 18:-0.818442 19:-0.026489 20:-0.21509 21:-0.0141236 22:0.584237 23:-0.781831 24:-0.017383 25:-0.0369549 26:-0.340791 27:-0.533432 28:-0.598044 29:0.0904442 30:-0.183466 31:-0.545033 32:-0.682467 33:0.0904442 34:-0.128898 35:-0.561393 36:-0.673501
+-1024 1:-0.722632 2:-0.304851 3:-0.44556 4:0.902425 5:-0.927831 6:-0.0618516 7:-0.402705 8:-0.507198 9:-0.629091 10:-0.671912 11:-0.379588 12:-0.485433 13:-0.597517 14:-0.749323 15:-0.391319 16:-0.533681 17:-0.580865 18:-0.762536 19:-0.0464713 20:-0.0871068 21:0.037639 22:0.765214 23:-0.75375 24:0.318295 25:0.0607787 26:-0.76484 27:-0.259239 28:-0.617774 29:0.166876 30:-0.586211 31:-0.259607 32:-0.620875 33:0.146493 34:-0.636186 35:-0.242896 36:-0.632582
+-1024 1:-0.820335 2:-0.954169 3:-0.756725 4:-0.0621997 5:-0.999337 6:-0.976238 7:-0.74806 8:0.0422279 9:-0.999411 10:-0.977269 11:-0.702751 12:0.340895 13:-0.999441 14:-0.983315 15:-0.705575 16:0.354639 17:-0.99944 18:-0.983436 19:-0.414903 20:-0.798882 21:-0.437651 22:0.506217 23:-0.999375 24:-0.856624 25:-0.430278 26:0.575592 27:-0.99934 28:-0.866498 29:-0.256054 30:0.80634 31:-0.998819 32:-0.925061 33:-0.256054 34:0.803457 35:-0.998813 36:-0.924936
+-840.7368286996084 1:-0.780591 2:-0.770293 3:-0.613871 4:0.226791 5:-0.993314 6:-0.850984 7:-0.606521 8:0.345063 9:-0.992777 10:-0.857568 11:-0.474098 12:0.54609 13:-0.987991 14:-0.918359 15:-0.475222 16:0.533775 17:-0.986911 18:-0.920415 19:-0.0895595 20:-0.485094 21:-0.115301 22:0.464253 23:-0.913344 24:-0.539093 25:-0.084631 26:0.49427 27:-0.908296 28:-0.57563 29:0.0394883 30:0.45022 31:-0.869077 32:-0.780689 33:0.0242015 34:0.35664 35:-0.852814 36:-0.791685
+1024 1:-0.734638 2:-0.504281 3:-0.529008 4:0.118294 5:-0.888029 6:-0.618225 7:-0.528675 8:0.0695042 9:-0.859595 10:-0.668518 11:-0.512207 12:0.0491959 13:-0.817712 14:-0.791099 15:-0.508784 16:-0.0446429 17:-0.799792 18:-0.809176 19:-0.0801924 20:-0.292779 21:-0.129418 22:0.168251 23:-0.728059 24:-0.38112 25:-0.127539 26:0.0288893 27:-0.667131 28:-0.475277 29:-0.0522324 30:0.13722 31:-0.637769 32:-0.632592 33:-0.0624236 34:0.106825 35:-0.626142 36:-0.634477
+-1024 1:-0.730084 2:-0.59523 3:-0.519108 4:0.277606 5:-0.958305 6:-0.698681 7:-0.486213 8:0.380125 9:-0.958941 10:-0.728853 11:-0.384161 12:0.409363 13:-0.939553 14:-0.852724 15:-0.392845 16:0.342678 17:-0.931007 18:-0.860292 19:-0.0464713 20:-0.32163 21:-0.0800064 22:0.287548 23:-0.793703 24:-0.399276 25:-0.0870148 26:0.187908 27:-0.749185 28:-0.475839 29:-0.0420412 30:0.201848 31:-0.680637 32:-0.649555 33:-0.0547802 34:0.140869 35:-0.662004 36:-0.657007
+1024 1:-0.73381 2:-0.503116 3:-0.52618 4:0.12772 5:-0.889573 6:-0.615328 7:-0.525844 8:0.0698901 9:-0.859685 10:-0.668673 11:-0.509158 12:0.0544439 13:-0.81782 14:-0.789338 15:-0.505733 16:-0.0426777 17:-0.79902 18:-0.80754 19:-0.083939 20:-0.289995 21:-0.127065 22:0.15885 23:-0.723643 24:-0.382365 25:-0.127539 26:0.0312155 27:-0.667054 28:-0.472926 29:-0.0547802 30:0.140337 31:-0.633618 32:-0.626584 33:-0.0675192 34:0.102206 35:-0.619588 36:-0.629793
+1024 1:-0.742917 2:-0.67478 3:-0.473847 4:0.389594 5:-0.98822 6:-0.761383 7:-0.486213 8:0.497429 9:-0.987685 10:-0.780323 11:-0.30642 12:0.645767 13:-0.978322 14:-0.875638 15:-0.310468 16:0.666302 17:-0.978859 18:-0.87406 19:0.296318 20:-0.334486 21:0.289403 22:0.592322 23:-0.872067 24:-0.320408 25:0.175201 26:0.596353 27:-0.866725 28:-0.382335 29:0.447131 30:0.340844 31:-0.784727 32:-0.714782 33:0.424201 34:0.393973 35:-0.790207 36:-0.700648
+1024 1:-0.721183 2:-0.610617 3:-0.430001 4:0.439102 5:-0.980152 6:-0.699352 7:-0.443752 8:0.530115 9:-0.977864 10:-0.72471 11:-0.230201 12:0.611629 13:-0.964642 14:-0.853712 15:-0.235716 16:0.622018 17:-0.964458 18:-0.852847 19:0.243246 20:-0.28834 21:0.218814 22:0.455007 23:-0.808886 24:-0.277937 25:0.120374 26:0.604214 27:-0.845851 28:-0.308292 29:0.357958 30:0.199778 31:-0.724445 32:-0.698462 33:0.317193 34:0.255933 35:-0.729432 36:-0.681173
+1024 1:-0.848279 2:-0.926571 3:-0.755311 4:0.024605 5:-0.999938 6:-0.958449 7:-0.746645 8:0.129114 9:-0.999921 10:-0.961831 11:-0.698177 12:0.444909 13:-0.999898 14:-0.972258 15:-0.704049 16:0.457656 17:-0.999904 18:-0.972191 19:-0.252543 20:-0.721035 21:-0.195299 22:0.658865 23:-0.999322 24:-0.789422 25:-0.239576 26:0.721985 27:-0.999068 28:-0.799005 29:-0.00127645 30:0.991163 31:-0.998014 32:-0.884402 33:-0.0140154 34:0.990931 35:-0.997977 36:-0.882827
+-1024 1:-0.680404 2:-0.496548 3:-0.367768 4:0.586768 5:-0.969122 6:-0.562185 7:-0.333352 8:0.646796 9:-0.965959 10:-0.628261 11:-0.0167925 12:0.610769 13:-0.943957 14:-0.816351 15:-0.0282468 16:0.654973 17:-0.946267 18:-0.810776 19:0.112749 20:-0.206338 21:0.124698 22:0.356769 23:-0.71188 24:-0.140073 25:0.0798491 26:0.417702 27:-0.748552 28:-0.248509 29:0.240762 30:-0.130824 31:-0.561203 32:-0.669609 33:0.197449 34:-0.00137854 35:-0.578699 36:-0.630534
+1024 1:-0.683302 2:-0.310563 3:-0.439902 4:0.288644 5:-0.80577 6:-0.302549 7:-0.422521 8:0.109088 9:-0.75869 10:-0.461596 11:-0.355198 12:-0.415541 13:-0.625075 14:-0.747262 15:-0.365385 16:-0.267387 17:-0.646055 18:-0.721523 19:0.08153 20:-0.0759286 21:0.0517568 22:0.173081 23:-0.512556 24:0.0620175 25:0.0965357 26:-0.196797 27:-0.396632 28:-0.258193 29:0.238214 30:-0.514171 31:-0.277207 32:-0.596037 33:0.205093 34:-0.387298 35:-0.300316 36:-0.55589
+1024 1:-0.832133 2:-0.943727 3:-0.749653 4:-0.0341162 5:-0.999545 6:-0.971251 7:-0.74806 8:0.0749473 9:-0.999568 10:-0.97144 11:-0.690556 12:0.37559 13:-0.999546 14:-0.97988 15:-0.691845 16:0.388149 17:-0.999541 18:-0.980098 19:-0.342466 20:-0.761368 21:-0.320005 22:0.564936 23:-0.999137 24:-0.830879 25:-0.323009 26:0.642507 27:-0.998866 28:-0.833419 29:-0.0929971 30:0.876648 31:-0.997956 32:-0.909116 33:-0.0980927 34:0.874061 35:-0.998057 36:-0.909491
+1024 1:-0.683095 2:-0.100163 3:-0.404542 4:0.136112 5:-0.649559 6:-0.12922 7:-0.392797 8:0.228972 9:-0.617932 10:-0.100955 11:-0.289652 12:-0.465614 13:-0.491682 14:-0.635037 15:-0.293687 16:-0.536201 17:-0.460016 18:-0.646135 19:0.220143 20:0.165785 21:0.277638 22:-0.270723 23:-0.23671 24:-0.0265839 25:0.237178 26:-0.0949332 27:-0.232853 28:0.132416 29:0.37834 30:-0.363463 31:-0.091318 32:-0.346233 33:0.340123 34:-0.113814 35:-0.141716 36:-0.245386
+1024 1:-0.684751 2:-0.273592 3:-0.411614 4:0.24647 5:-0.814764 6:-0.367418 7:-0.411198 8:0.359802 9:-0.791825 10:-0.336451 11:-0.280506 12:-0.0488917 13:-0.727641 14:-0.715672 15:-0.308942 16:-0.0849795 17:-0.705533 18:-0.711972 19:0.163949 20:0.0300424 21:0.232932 22:-0.155468 23:-0.407974 24:-0.162485 25:0.16805 26:0.106407 27:-0.436458 28:0.0289349 29:0.314645 30:-0.318812 31:-0.247939 32:-0.467874 33:0.250953 34:-0.047338 35:-0.29352 36:-0.362384
+-1024 1:-0.791148 2:-0.742322 3:-0.605385 4:0.273276 5:-0.992786 6:-0.826158 7:-0.599444 8:0.418112 9:-0.991873 10:-0.821244 11:-0.452757 12:0.587987 13:-0.985729 14:-0.903308 15:-0.459967 16:0.601675 17:-0.985736 18:-0.90221 19:-0.0202447 20:-0.384273 21:0.0470509 22:0.464978 23:-0.879528 24:-0.450848 25:-0.00596547 26:0.553195 27:-0.866187 28:-0.40593 29:0.230571 30:0.403539 31:-0.820056 32:-0.734247 33:0.169423 34:0.432435 35:-0.816693 36:-0.717303
+-1024 1:-0.699034 2:-0.300126 3:-0.431416 4:0.131661 5:-0.787121 6:-0.404238 7:-0.416859 8:0.276808 9:-0.781693 10:-0.380012 11:-0.358247 12:-0.210355 13:-0.656436 14:-0.701251 15:-0.366911 16:-0.197479 17:-0.646488 18:-0.694353 19:-0.0333583 20:-0.0285756 21:0.115287 22:-0.275436 23:-0.36711 24:-0.244882 25:0.0583949 26:0.0286645 27:-0.424504 28:-0.0469962 29:0.197449 30:-0.341955 31:-0.263367 32:-0.500534 33:0.108276 34:-0.0933943 35:-0.295083 36:-0.3963
+-1024 1:-0.608375 2:0.013959 3:-0.253202 4:0.49744 5:-0.686228 6:0.147901 7:-0.245596 8:0.166187 9:-0.573063 10:-0.0919888 11:-0.15246 12:-0.41959 13:-0.401192 14:-0.51417 15:-0.150288 16:-0.427726 17:-0.39491 18:-0.522545 19:0.32504 20:0.31478 21:0.352933 22:0.0386214 23:-0.212642 24:0.446797 25:0.375438 26:-0.432395 27:-0.0171099 28:0.0521964 29:0.477704 30:-0.263713 31:0.0457103 32:-0.148831 33:0.470061 34:-0.258086 35:0.0458365 36:-0.146935
+704.9442341689459 1:-0.611272 2:0.0183916 3:-0.254616 4:0.481367 5:-0.677424 6:0.150922 7:-0.245596 8:0.167304 9:-0.568533 10:-0.0828297 11:-0.153984 12:-0.429842 13:-0.391942 14:-0.509642 15:-0.154865 16:-0.439969 17:-0.382904 18:-0.516557 19:0.311928 20:0.328747 21:0.348227 22:0.0104469 23:-0.179896 24:0.464717 25:0.373055 26:-0.40211 27:-0.0201257 28:0.0847027 29:0.475156 30:-0.25573 31:0.0608979 32:-0.128789 33:0.470061 34:-0.21157 35:0.0450807 36:-0.116729
+1024 1:-0.599061 2:0.0283433 3:-0.237645 4:0.49688 5:-0.679671 6:0.15878 7:-0.224367 8:0.186463 9:-0.569481 10:-0.0692323 11:-0.141789 12:-0.431259 13:-0.380725 14:-0.498019 15:-0.139609 16:-0.434732 17:-0.376123 18:-0.50612 19:0.326913 20:0.325647 21:0.364697 22:-0.0054192 23:-0.17952 24:0.441607 25:0.394507 26:-0.39035 27:-0.0349678 28:0.0797814 29:0.490441 30:-0.263802 31:0.0603333 32:-0.13399 33:0.480252 34:-0.228451 35:0.0495191 36:-0.123188
+-1024 1:-0.598854 2:0.0202499 3:-0.239059 4:0.484213 5:-0.680623 6:0.142791 7:-0.227198 8:0.209102 9:-0.583007 10:-0.0734353 11:-0.141789 12:-0.40276 13:-0.39719 14:-0.501416 15:-0.138084 16:-0.420224 17:-0.388901 18:-0.512052 19:0.321918 20:0.316798 21:0.352933 22:0.0164653 23:-0.199757 24:0.437255 25:0.387357 26:-0.392204 27:-0.0434163 28:0.0646042 29:0.4828 30:-0.289354 31:0.0569679 32:-0.154521 33:0.470061 34:-0.250951 35:0.0471101 36:-0.140978
+1024 1:-0.723874 2:-0.999711 3:-0.558711 4:-0.223659 5:-0.998981 6:-0.999744 7:-0.558399 8:-0.134645 9:-0.999018 10:-0.999653 11:-0.51678 12:0.13081 13:-0.999095 14:-0.999204 15:-0.519463 16:0.147522 17:-0.999065 18:-0.999313 19:-0.655943 20:-0.990013 21:-0.800001 22:0.145748 23:-0.999727 24:-0.991773 25:-0.814066 26:0.209327 27:-0.99984 28:-0.992414 29:-0.755414 30:0.360877 31:-0.999791 32:-0.99383 33:-0.757962 34:0.356116 35:-0.99981 36:-0.993852
+-532.5557952898562 1:-0.693859 2:-0.453659 3:-0.350796 4:0.597046 5:-0.965522 6:-0.542298 7:-0.360243 8:0.785865 9:-0.967199 10:-0.54134 11:-0.0167925 12:0.768279 13:-0.945776 14:-0.773768 15:-0.0221447 16:0.670634 17:-0.937985 18:-0.788801 19:0.310679 20:-0.0484984 21:0.24705 22:0.260193 23:-0.608341 24:-0.0320453 25:0.2634 26:0.291177 27:-0.610861 28:-0.0936786 29:0.286622 30:0.291512 31:-0.479464 32:-0.367008 33:0.319741 34:-0.0628567 35:-0.406742 36:-0.484192
+1024 1:-0.625347 2:-0.265874 3:-0.301292 4:0.329048 5:-0.827564 6:-0.344929 7:-0.313536 8:0.340656 9:-0.797107 10:-0.377825 11:-0.285079 12:0.0326177 13:-0.674185 14:-0.618886 15:-0.263176 16:0.0154819 17:-0.685066 18:-0.646526 19:0.273216 20:0.0615286 21:0.237638 22:-0.0198899 23:-0.388912 24:0.0389523 25:0.225259 26:-0.114122 27:-0.341283 28:-0.0704021 29:0.296811 30:0.147547 31:-0.296203 32:-0.243497 33:0.345219 34:-0.240245 35:-0.205903 36:-0.384329
+-1024 1:-0.622656 2:-0.27198 3:-0.299877 4:0.322584 5:-0.829524 6:-0.355393 7:-0.310705 8:0.350351 9:-0.803837 10:-0.385541 11:-0.286603 12:0.0352504 13:-0.677396 14:-0.62172 15:-0.263176 16:0.0220437 17:-0.689925 18:-0.649646 19:0.284455 20:0.0620362 21:0.254109 22:-0.026851 23:-0.382373 24:0.041939 25:0.237178 26:-0.13227 27:-0.332289 28:-0.0763943 29:0.301906 30:0.120021 31:-0.275847 32:-0.238341 33:0.350314 34:-0.25761 35:-0.192421 36:-0.381244
+1024 1:-0.737122 2:-0.583486 3:-0.448389 4:0.469101 5:-0.979554 6:-0.673932 7:-0.442337 8:0.611713 9:-0.979129 10:-0.685946 11:-0.219531 12:0.748894 13:-0.968542 14:-0.826026 15:-0.23114 16:0.639269 17:-0.960241 18:-0.838588 19:0.266972 20:-0.177226 21:0.22352 22:0.483613 23:-0.769449 24:-0.140173 25:0.251481 26:0.492995 27:-0.762979 28:-0.21428 29:0.352862 30:0.546345 31:-0.695892 32:-0.504042 33:0.368149 34:0.141925 35:-0.614107 36:-0.600077
+-1024 1:-0.764652 2:-0.589071 3:-0.527594 4:0.566901 5:-0.986226 6:-0.623178 7:-0.513106 8:0.623663 9:-0.9818 10:-0.676321 11:-0.30642 12:0.732568 13:-0.969619 14:-0.828643 15:-0.308942 16:0.770249 17:-0.971315 18:-0.824168 19:-0.0414759 20:-0.164278 21:0.0541097 22:0.67634 23:-0.786795 24:0.0809465 25:0.00118594 26:0.447943 27:-0.721093 28:-0.142682 29:0.215284 30:-0.0323867 31:-0.562678 32:-0.628262 33:0.230571 34:0.140603 35:-0.608739 36:-0.595913
+1024 1:-0.790734 2:-0.992841 3:-0.69732 4:-0.1877 5:-0.99884 6:-0.996163 7:-0.674461 8:-0.102745 9:-0.998906 10:-0.996936 11:-0.649399 12:0.172907 13:-0.998967 14:-0.996918 15:-0.653708 16:0.193515 17:-0.999048 18:-0.996966 19:-0.64283 20:-0.949731 21:-0.745883 22:0.227838 23:-0.999627 24:-0.968738 25:-0.711564 26:0.286761 27:-0.999561 28:-0.973117 29:-0.656053 30:0.464043 31:-0.999564 32:-0.981918 33:-0.663696 34:0.461103 35:-0.999585 36:-0.981656
+1024 1:-0.729049 2:-0.200198 3:-0.430001 4:0.24768 5:-0.723753 6:-0.161438 7:-0.450829 8:0.375095 9:-0.74375 10:-0.210316 11:-0.382637 12:-0.337151 13:-0.53287 14:-0.627791 15:-0.382166 16:-0.389029 17:-0.514432 18:-0.642475 19:-0.0489697 20:0.110494 21:0.0846981 22:0.0221961 23:-0.283657 24:0.287293 25:0.0583949 26:-0.130905 27:-0.276982 28:0.00364115 29:0.171971 30:-0.429445 31:-0.0802465 32:-0.388725 33:0.164328 34:-0.45446 35:-0.0544625 36:-0.382015
+1024 1:-0.726979 2:-0.198776 3:-0.425758 4:0.255099 5:-0.727078 6:-0.161985 7:-0.443752 8:0.41034 9:-0.752872 10:-0.200224 11:-0.379588 12:-0.299224 13:-0.542012 14:-0.620046 15:-0.376064 16:-0.35968 17:-0.523091 18:-0.637478 19:-0.0758214 20:0.108155 21:0.0682275 22:-0.0168488 23:-0.263876 24:0.264665 25:0.0393244 26:-0.0933543 27:-0.295309 28:0.0199591 29:0.13885 30:-0.39728 31:-0.0835157 32:-0.375127 33:0.133754 34:-0.456419 35:-0.0476633 36:-0.379772
+-1024 1:-0.724495 2:-0.196266 3:-0.424344 4:0.241647 5:-0.72136 6:-0.163698 7:-0.440921 8:0.418335 9:-0.753374 10:-0.194784 11:-0.375015 12:-0.311519 13:-0.536497 14:-0.619498 15:-0.376064 16:-0.372428 17:-0.516145 18:-0.636254 19:-0.0652055 20:0.111369 21:0.0799922 22:-0.0700022 23:-0.235069 24:0.237539 25:0.0464758 26:-0.0485833 27:-0.31272 28:0.0485652 29:0.146493 30:-0.387074 31:-0.0848952 32:-0.369644 33:0.149041 34:-0.451944 35:-0.0507316 36:-0.378514
+-1024 1:-0.805431 2:-0.899127 3:-0.748239 4:0.0278828 5:-0.997919 6:-0.942639 7:-0.750891 8:0.156241 9:-0.998008 10:-0.938989 11:-0.663118 12:0.441529 13:-0.997811 14:-0.963019 15:-0.65981 16:0.452538 17:-0.997837 18:-0.963695 19:-0.484842 20:-0.730614 21:-0.482357 22:0.389929 23:-0.979453 24:-0.795887 25:-0.501792 26:0.509329 27:-0.980737 28:-0.772416 29:-0.301911 30:0.522928 31:-0.966786 32:-0.902165 33:-0.307007 34:0.46132 35:-0.961456 36:-0.908203
+-1024 1:-0.789078 2:-0.958624 3:-0.785013 4:-0.107822 5:-0.99813 6:-0.979656 7:-0.770707 8:-0.00872991 9:-0.998042 10:-0.980021 11:-0.710372 12:0.271785 13:-0.998134 14:-0.98633 15:-0.710152 16:0.287158 17:-0.998203 18:-0.986653 19:-0.541668 20:-0.873486 21:-0.668238 22:0.290195 23:-0.995232 24:-0.915756 25:-0.661505 26:0.34364 27:-0.993338 28:-0.916841 29:-0.546497 30:0.475288 31:-0.991194 32:-0.956866 33:-0.561784 34:0.449075 35:-0.989695 36:-0.957617
+1024 1:-0.760719 2:-0.614328 3:-0.570026 4:0.0326506 5:-0.928839 6:-0.762259 7:-0.569722 8:0.212304 9:-0.932712 10:-0.736548 11:-0.564034 12:0.143912 13:-0.889755 14:-0.851076 15:-0.556076 16:0.0146531 17:-0.874295 18:-0.875539 19:-0.341841 20:-0.498552 21:-0.38118 22:0.0678877 23:-0.820119 24:-0.661306 25:-0.430278 26:0.250733 27:-0.829484 28:-0.57171 29:-0.355415 30:0.19132 31:-0.772863 32:-0.760365 33:-0.352867 34:-0.0119635 35:-0.728973 36:-0.797848
+-1024 1:-0.794667 2:-0.765343 3:-0.660547 4:0.172345 5:-0.990273 6:-0.854261 7:-0.644737 8:0.308505 9:-0.990109 10:-0.854152 11:-0.484768 12:0.443727 13:-0.983651 14:-0.928196 15:-0.479799 16:0.402478 17:-0.981312 18:-0.934141 19:-0.362448 20:-0.586659 21:-0.414121 22:0.219252 23:-0.900162 24:-0.697773 25:-0.446965 26:0.286176 27:-0.887063 28:-0.668651 29:-0.416562 30:0.281983 31:-0.845869 32:-0.807841 33:-0.414014 34:0.0716439 35:-0.807247 36:-0.843342
+593.7566519540585 1:-0.905408 2:-0.949906 3:-1 4:-0.260688 5:-0.969866 6:-0.968299 7:-1 8:-0.145098 9:-0.975337 10:-0.963128 11:-1 12:-0.0303933 13:-0.96032 14:-0.986762 15:-1 16:-0.0173701 17:-0.961051 18:-0.98832 19:-0.710893 20:-0.898549 21:-0.811766 22:0.0979849 23:-0.974468 24:-0.940131 25:-0.83552 26:0.138333 27:-0.970712 28:-0.942295 29:-0.801274 30:0.219449 31:-0.965198 32:-0.966731 33:-0.796179 34:0.225251 35:-0.967156 36:-0.96678
+1024 1:-0.940595 2:-0.771626 3:-0.953325 4:-0.235785 5:-0.848847 6:-0.773613 7:-0.951876 8:-0.193456 9:-0.843973 10:-0.801612 11:-0.949696 12:-0.278545 13:-0.786546 14:-0.924855 15:-0.952708 16:-0.264081 17:-0.785174 18:-0.92715 19:-0.932563 20:-0.801568 21:-1 22:-0.00298522 23:-0.851626 24:-0.810514 25:-1 26:-0.163528 27:-0.785404 28:-0.918256 29:-1 30:-0.095694 31:-0.775564 32:-0.930924 33:-1 34:-0.0959015 35:-0.780642 36:-0.932215
+-560.4287232118101 1:-0.754923 2:-0.692452 3:-0.554468 4:0.517034 5:-0.999413 6:-0.765733 7:-0.552737 8:0.655949 9:-0.999307 10:-0.780739 11:-0.396356 12:1 13:-0.998428 14:-0.862354 15:-0.397422 16:1 17:-0.998377 18:-0.862905 19:0.447419 20:-0.232235 21:0.618812 22:0.985604 23:-0.928606 24:-0.14844 25:0.370671 26:1 27:-0.900835 28:-0.103302 29:0.903181 30:0.863324 31:-0.838881 32:-0.587242 33:0.89299 34:0.834857 35:-0.835305 36:-0.593345
+-1024 1:-0.580434 2:-0.139043 3:-0.227744 4:0.493748 5:-0.787276 6:-0.108026 7:-0.241352 8:0.501593 9:-0.765378 10:-0.189101 11:-0.131119 12:-0.103709 13:-0.605195 14:-0.592439 15:-0.128931 16:-0.10546 17:-0.604433 18:-0.600426 19:0.450541 20:0.218261 21:0.418813 22:0.21464 23:-0.361064 24:0.419214 25:0.394507 26:0.0971246 27:-0.310241 28:0.251413 29:0.538849 30:-0.352065 31:-0.043609 32:-0.286898 33:0.561779 34:-0.395714 35:-0.0441158 36:-0.313152
+-1024 1:-0.583331 2:-0.127617 3:-0.227744 4:0.503776 5:-0.782356 6:-0.0857205 7:-0.242767 8:0.494247 9:-0.755969 10:-0.173569 11:-0.131119 12:-0.118463 13:-0.592031 14:-0.58362 15:-0.130456 16:-0.124436 17:-0.590665 18:-0.592981 19:0.408708 20:0.232556 21:0.407048 22:0.20059 23:-0.345009 24:0.431527 25:0.377822 26:0.0746323 27:-0.28447 28:0.268747 29:0.513371 30:-0.364486 31:-0.0213345 32:-0.274851 33:0.538849 34:-0.401994 35:-0.0231168 36:-0.298781
+1024 1:-0.864011 2:-0.93833 3:-0.7935 4:-0.00530291 5:-1 6:-0.964581 7:-0.810337 8:0.116447 9:-1 10:-0.96047 11:-0.756104 12:0.419885 13:-1 14:-0.974211 15:-0.760494 16:0.432117 17:-1 18:-0.974301 19:-0.44675 20:-0.741861 21:-0.385886 22:0.61002 23:-0.999661 24:-0.803251 25:-0.423126 26:0.71649 27:-0.999633 28:-0.787683 29:-0.20255 30:0.952385 31:-0.998665 32:-0.889487 33:-0.210193 34:0.949276 35:-0.998649 36:-0.888964
+1024 1:-0.665914 2:0.0443828 3:-0.355038 4:-0.0280227 5:-0.468328 6:0.0195746 7:-0.358828 8:0.495764 9:-0.558992 10:0.281993 11:-0.228677 12:-0.938944 13:-0.242599 14:-0.610539 15:-0.232665 16:-1 17:-0.220113 18:-0.632834 19:-0.0364805 20:0.187231 21:0.112934 22:-0.604194 23:-0.0637599 24:-0.207913 25:0.0369406 26:-0.0653387 27:-0.183666 28:0.243847 29:0.212736 30:-0.670873 31:0.0420273 32:-0.421585 33:0.210188 34:-0.656646 35:0.0400794 36:-0.415805
+1024 1:-0.606926 2:-0.209617 3:-0.224915 4:0.573262 5:-0.877498 6:-0.280808 7:-0.218705 8:1 9:-0.907003 10:-0.175217 11:0.0228412 12:0.419372 13:-0.806961 14:-0.645456 15:0.0114171 16:0.376512 17:-0.800115 18:-0.655893 19:0.459907 20:0.210909 21:0.430578 22:0.0803932 23:-0.379959 24:0.190826 25:0.377822 26:0.491399 27:-0.455378 28:0.500665 29:0.561779 30:-0.196129 31:-0.12929 32:-0.274622 33:0.564327 34:-0.340121 35:-0.0850008 36:-0.318368
+-1024 1:-0.571534 2:-0.121821 3:-0.233402 4:0.248471 5:-0.73259 6:-0.213509 7:-0.237105 8:0.72551 9:-0.791707 10:-0.0590546 11:-0.123497 12:-0.12385 13:-0.586355 14:-0.579242 15:-0.119777 16:-0.244481 17:-0.562987 18:-0.613513 19:0.443673 20:0.228587 21:0.418813 22:-0.14071 23:-0.250983 24:0.125993 25:0.337298 26:0.263145 27:-0.333138 28:0.435867 29:0.543944 30:-0.332611 31:-0.0383058 32:-0.269958 33:0.569422 34:-0.505071 35:0.00422134 36:-0.332803
+-1024 1:-0.570499 2:-0.109587 3:-0.229159 4:0.264349 5:-0.7276 6:-0.188777 7:-0.234275 8:0.725869 9:-0.785292 10:-0.0426197 11:-0.121973 12:-0.137396 13:-0.571991 14:-0.568677 15:-0.119777 16:-0.254349 17:-0.549016 18:-0.602761 19:0.401215 20:0.243992 21:0.399989 22:-0.168869 23:-0.224365 24:0.132503 25:0.318227 26:0.220604 27:-0.300036 28:0.441366 29:0.518466 30:-0.34413 31:-0.0156143 32:-0.256829 33:0.543944 34:-0.525249 35:0.0306139 36:-0.322783
+1024 1:-0.854075 2:-0.647136 3:-0.701563 4:0.144273 5:-0.950712 6:-0.72517 7:-0.702767 8:0.218891 9:-0.950514 10:-0.757212 11:-0.568607 12:0.207114 13:-0.934875 14:-0.895213 15:-0.572855 16:0.201427 17:-0.933062 18:-0.898245 19:-0.586005 20:-0.45675 21:-0.491769 22:0.0809471 23:-0.762309 24:-0.537251 25:-0.511327 26:0.00130638 27:-0.732177 28:-0.625405 29:-0.459875 30:-0.0651721 31:-0.663583 32:-0.766107 33:-0.46497 34:-0.106841 35:-0.65624 36:-0.77716
+1024 1:-0.832547 2:-0.621236 3:-0.650646 4:0.241988 5:-0.955794 6:-0.685012 7:-0.661722 8:0.283815 9:-0.947967 10:-0.71963 11:-0.524402 12:0.234623 13:-0.924709 14:-0.873693 15:-0.517937 16:0.247044 17:-0.926243 18:-0.876905 19:-0.472977 20:-0.322609 21:-0.404709 22:0.156427 23:-0.681596 24:-0.28773 25:-0.415975 26:0.0357668 27:-0.648661 28:-0.445709 29:-0.307007 30:-0.186519 31:-0.557452 32:-0.716164 33:-0.289175 34:-0.322837 35:-0.527899 36:-0.749594
+1024 1:-0.838343 2:-0.877729 3:-0.734094 4:0.0629112 5:-0.996905 6:-0.925564 7:-0.72966 8:0.148957 9:-0.996157 10:-0.933259 11:-0.650923 12:0.414316 13:-0.994985 14:-0.957901 15:-0.653708 16:0.416208 17:-0.994592 18:-0.958698 19:-0.544166 20:-0.654295 21:-0.465886 22:0.453165 23:-0.96028 24:-0.666692 25:-0.508943 26:0.423748 27:-0.94986 28:-0.719338 29:-0.329937 30:0.351042 31:-0.919481 32:-0.877744 33:-0.340128 34:0.309414 35:-0.913598 36:-0.88173
+-1024 1:-0.86132 2:-0.593092 3:-0.687419 4:0.0684878 5:-0.915366 6:-0.690781 7:-0.69569 8:0.0410025 9:-0.889285 10:-0.727963 11:-0.638728 12:-0.0879217 13:-0.841629 14:-0.867239 15:-0.639978 16:-0.0937201 17:-0.839113 18:-0.872573 19:-0.58538 20:-0.432496 21:-0.463533 22:-0.0274368 23:-0.709977 24:-0.558208 25:-0.485105 26:-0.072205 27:-0.692016 28:-0.628484 29:-0.439492 30:-0.166227 31:-0.607048 32:-0.759791 33:-0.444588 34:-0.183387 35:-0.607752 36:-0.767429
+1024 1:-0.866495 2:-0.525055 3:-0.691662 4:-0.107767 5:-0.826631 6:-0.645624 7:-0.697106 8:-0.0563027 9:-0.821658 10:-0.669927 11:-0.673789 12:-0.293064 13:-0.720388 14:-0.821296 15:-0.675066 16:-0.311902 17:-0.714012 18:-0.830723 19:-0.598494 20:-0.41066 21:-0.472945 22:-0.118628 23:-0.650756 24:-0.55885 25:-0.499408 26:-0.157825 27:-0.63492 28:-0.626133 29:-0.454779 30:-0.172969 31:-0.576705 32:-0.738366 33:-0.457327 34:-0.197374 35:-0.575223 36:-0.747673
+1024 1:-0.823233 2:-0.941704 3:-0.752482 4:-0.0376373 5:-0.999281 6:-0.970196 7:-0.750891 8:0.0502977 9:-0.998513 10:-0.970545 11:-0.702751 12:0.351044 13:-0.998716 14:-0.978926 15:-0.705575 16:0.365926 17:-0.998763 18:-0.979081 19:-0.38368 20:-0.758562 21:-0.385886 22:0.565479 23:-0.99902 24:-0.820197 25:-0.37545 26:0.643479 27:-0.998528 28:-0.821748 29:-0.192359 30:0.884945 31:-0.998117 32:-0.903671 33:-0.197454 34:0.879744 35:-0.998028 36:-0.903369
+92.40316338964294 1:-0.826958 2:-0.940589 3:-0.749653 4:-0.041876 5:-0.999017 6:-0.96979 7:-0.750891 8:0.066065 9:-0.998937 10:-0.969231 11:-0.705799 12:0.358786 13:-0.998815 14:-0.978213 15:-0.708626 16:0.377102 17:-0.998961 18:-0.978307 19:-0.379308 20:-0.75246 21:-0.371768 22:0.57579 23:-0.998965 24:-0.814465 25:-0.37545 26:0.665729 27:-0.998999 28:-0.815238 29:-0.187263 30:0.901531 31:-0.998181 32:-0.900107 33:-0.192359 34:0.898383 35:-0.998202 36:-0.900051
+1024 1:-0.670675 2:-0.250797 3:-0.394641 4:0.129027 5:-0.764024 6:-0.364354 7:-0.38289 8:0.427881 9:-0.79649 10:-0.296574 11:-0.333857 12:-0.154426 13:-0.644587 14:-0.664765 15:-0.334875 16:-0.109962 17:-0.648314 18:-0.658607 19:0.117744 20:0.0731455 21:0.152934 22:-0.0384084 23:-0.417772 24:-0.0385507 25:0.13706 26:0.138906 27:-0.433772 28:0.0749498 29:0.245857 30:-0.116002 31:-0.258895 32:-0.36735 33:0.235666 34:-0.0878559 35:-0.263725 36:-0.356591
+-1024 1:-0.664673 2:-0.243077 3:-0.380497 4:0.158631 5:-0.769753 6:-0.352035 7:-0.372982 8:0.414937 9:-0.790136 10:-0.294597 11:-0.324711 12:-0.152549 13:-0.640664 14:-0.659272 15:-0.328772 16:-0.110697 17:-0.642328 18:-0.651966 19:0.135227 20:0.0863415 21:0.169405 22:-0.0526607 23:-0.404037 24:-0.0320704 25:0.156131 26:0.120943 27:-0.417679 28:0.0821674 29:0.263692 30:-0.120029 31:-0.246684 32:-0.356835 33:0.248405 34:-0.0785044 35:-0.25187 36:-0.338543
+1024 1:-0.762375 2:-0.990566 3:-0.650646 4:-0.186411 5:-0.998983 6:-0.996573 7:-0.661722 8:-0.0898684 9:-0.999041 10:-0.996165 11:-0.614339 12:0.182091 13:-0.999069 14:-0.996685 15:-0.617095 16:0.197924 17:-0.999062 18:-0.996835 19:-0.608485 20:-0.939553 21:-0.670591 22:0.240269 23:-0.999584 24:-0.967498 25:-0.704413 26:0.312944 27:-0.99971 28:-0.966577 29:-0.607644 30:0.492664 31:-0.999647 32:-0.979282 33:-0.610192 34:0.487633 35:-0.999666 36:-0.979432
+-363.4232231995459 1:-0.770448 2:-0.660182 3:-0.56154 4:0.42295 5:-0.98993 6:-0.733229 7:-0.49329 8:0.509256 9:-0.986257 10:-0.766024 11:-0.274408 12:0.674701 13:-0.981029 14:-0.877447 15:-0.290636 16:0.619421 17:-0.976215 18:-0.87974 19:0.305684 20:-0.281215 21:0.26352 22:0.525545 23:-0.842543 24:-0.29916 25:0.0726977 26:0.356822 27:-0.730093 28:-0.270919 29:0.421653 30:0.159259 31:-0.715234 32:-0.703956 33:0.398723 34:0.179952 35:-0.709254 36:-0.688946
+-1024 1:-0.853661 2:-0.921346 3:-0.743995 4:0.0234861 5:-0.99973 6:-0.957582 7:-0.756553 8:0.146933 9:-0.999798 10:-0.954984 11:-0.684458 12:0.452416 13:-0.999747 14:-0.970588 15:-0.688794 16:0.465439 17:-0.999745 18:-0.970453 19:-0.28439 20:-0.713188 21:-0.218828 22:0.653699 23:-0.999245 24:-0.788337 25:-0.246727 26:0.757406 27:-0.999348 28:-0.781227 29:0.0420361 30:1 31:-0.998232 32:-0.885743 33:0.0318449 34:1 35:-0.998218 36:-0.884448
+-85.2886875628443 1:-0.786801 2:-0.565121 3:-0.630844 4:-0.310263 5:-0.765908 6:-0.700519 7:-0.644737 8:-0.034395 9:-0.809966 10:-0.638182 11:-0.568607 12:-0.518703 13:-0.706653 14:-0.870393 15:-0.563702 16:-0.429571 17:-0.72205 18:-0.863131 19:0.268221 20:-0.282858 21:0.235285 22:0.428356 23:-0.821395 24:-0.341018 25:0.206191 26:0.788653 27:-0.866405 28:-0.186331 29:0.531205 30:0.362513 31:-0.774095 32:-0.693486 33:0.475156 34:0.309431 35:-0.752303 36:-0.687093
+1024 1:-0.839378 2:-0.994169 3:-0.782184 4:-0.150573 5:-0.999705 6:-0.99368 7:-0.806091 8:-0.0503316 9:-0.999736 10:-0.992801 11:-0.759152 12:0.230053 13:-0.999756 14:-0.994582 15:-0.763545 16:0.247121 17:-0.999774 18:-0.994643 19:-0.661563 20:-0.918176 21:-0.65412 22:0.296655 23:-0.999989 24:-0.952871 25:-0.697262 26:0.370515 27:-0.999976 28:-0.949716 29:-0.592358 30:0.564389 31:-0.999958 32:-0.970644 33:-0.600001 34:0.560994 35:-0.999971 36:-0.970359
+-1024 1:-0.547318 2:-0.136946 3:-0.251788 4:0.336443 5:-0.717715 6:-0.0943306 7:-0.258335 8:0.165794 9:-0.61408 10:-0.165501 11:-0.0686211 12:-0.706728 13:-0.469614 14:-0.692957 15:-0.060283 16:-0.439986 17:-0.523808 18:-0.650686 19:0.527341 20:0.0915633 21:0.352933 22:0.0542958 23:-0.359547 24:0.193903 25:0.277702 26:0.160842 27:-0.309725 28:0.338851 29:0.717192 30:-0.974295 31:-0.0427716 32:-0.598516 33:0.699357 34:-0.755337 35:-0.0961934 36:-0.539437
+1024 1:-0.646665 2:-0.99263 3:-0.302706 4:-0.252186 5:-0.998029 6:-1 7:-0.29372 8:-0.159863 9:-0.998381 10:-1 11:-0.224104 12:0.099947 13:-0.998484 14:-0.999456 15:-0.220461 16:0.117464 17:-0.998525 18:-0.99964 19:-0.499204 20:-1 21:-0.635297 22:0.0949278 23:-0.999346 24:-0.999719 25:-0.649586 26:0.153886 27:-0.999382 28:-1 29:-0.592358 30:0.290892 31:-0.999378 32:-0.9984 33:-0.592358 34:0.286745 35:-0.999418 36:-0.998409
+1024 1:-0.699241 2:-0.296544 3:-0.425758 4:0.176583 5:-0.805924 6:-0.406148 7:-0.41969 8:0.408722 9:-0.811545 10:-0.339746 11:-0.373491 12:-0.187313 13:-0.666097 14:-0.703226 15:-0.37759 16:-0.173881 17:-0.666691 18:-0.706896 19:0.000359672 20:-0.016224 21:0.0799922 22:-0.111257 23:-0.456817 24:-0.198441 25:-0.0202683 26:0.186329 27:-0.486066 28:0.0376019 29:0.154137 30:-0.221656 31:-0.321076 32:-0.49142 33:0.13885 34:-0.163934 35:-0.334299 36:-0.473573
+1024 1:-0.672332 2:-0.300708 3:-0.379083 4:0.290584 5:-0.849156 6:-0.416318 7:-0.370151 8:0.5401 9:-0.857777 10:-0.35892 11:-0.243921 12:-0.0426184 13:-0.758106 14:-0.74811 15:-0.255548 16:-0.0147556 17:-0.758968 18:-0.746032 19:0.176437 20:0.0208998 21:0.211758 22:-0.0184466 23:-0.482698 24:-0.128111 25:0.158515 26:0.289227 27:-0.524757 28:0.0870587 29:0.307002 30:-0.218506 31:-0.308968 32:-0.47128 33:0.317193 34:-0.161531 35:-0.330134 36:-0.461037
+420.8736318819327 1:-0.688891 2:-0.285435 3:-0.404542 4:0.194669 5:-0.80798 6:-0.398448 7:-0.40129 8:0.425024 9:-0.811528 10:-0.330121 11:-0.358247 12:-0.19144 13:-0.65992 14:-0.697504 15:-0.359283 16:-0.166815 17:-0.663664 18:-0.699947 19:0.017218 20:-0.00853033 21:0.0964629 22:-0.1019 23:-0.457587 24:-0.187885 25:0.00356975 26:0.200905 27:-0.489153 28:0.0500097 29:0.179615 30:-0.256584 31:-0.306363 32:-0.495156 33:0.159232 34:-0.181033 35:-0.321032 36:-0.469312
+1024 1:-0.8785 2:-0.797696 3:-0.821786 4:0.177952 5:-0.99159 6:-0.816402 7:-0.808921 8:0.25532 9:-0.990367 10:-0.845708 11:-0.673789 12:0.370915 13:-0.98405 14:-0.937132 15:-0.676591 16:0.388055 17:-0.98438 18:-0.937118 19:-0.772711 20:-0.656256 21:-0.774119 22:0.0786143 23:-0.83912 24:-0.685375 25:-0.787844 26:0.0145556 27:-0.814389 28:-0.764287 29:-0.740127 30:-0.10619 31:-0.767316 32:-0.8918 33:-0.747771 34:-0.122182 35:-0.764059 36:-0.894558
+1024 1:-0.642112 2:-0.323698 3:-0.312607 4:0.638247 5:-0.91429 6:-0.32088 7:-0.305044 8:0.318444 9:-0.851671 10:-0.514548 11:-0.143314 12:0.176078 13:-0.809001 14:-0.735082 15:-0.139609 16:0.200948 17:-0.813656 18:-0.738009 19:0.270094 20:-0.00398711 21:0.249403 22:0.501833 23:-0.660755 24:0.163715 25:0.284854 26:-0.0615235 27:-0.4893 28:-0.253047 29:0.380888 30:0.00792247 31:-0.406324 32:-0.444173 33:0.380888 34:-0.176067 35:-0.359167 36:-0.494444
+1024 1:-0.67171 2:-0.392576 3:-0.35221 4:0.736989 5:-0.959184 6:-0.403212 7:-0.299382 8:0.557946 9:-0.931753 10:-0.56785 11:-0.0564261 12:0.506981 13:-0.903924 14:-0.770674 15:-0.0419766 16:0.532732 17:-0.908037 18:-0.773491 19:0.220768 20:-0.0359529 21:0.202346 22:0.518019 23:-0.689173 24:0.117786 25:0.251481 26:-0.0407451 27:-0.523895 28:-0.29035 29:0.307002 30:0.0603067 31:-0.443167 32:-0.456221 33:0.324836 34:-0.13463 35:-0.395511 36:-0.510122
+1024 1:-0.636937 2:-0.0962408 3:-0.299877 4:0.582609 5:-0.768076 6:0.0515451 7:-0.320613 8:0.0262981 9:-0.61801 10:-0.284663 11:-0.185996 12:-0.427175 13:-0.519606 14:-0.638811 15:-0.193003 16:-0.4017 17:-0.522188 18:-0.641097 19:0.323167 20:0.154015 21:0.315285 22:0.326672 23:-0.478139 24:0.34021 25:0.33253 26:-0.346191 27:-0.239824 28:-0.169963 29:0.442035 30:-0.115889 31:-0.204859 32:-0.306002 33:0.449678 34:-0.238891 35:-0.167202 36:-0.342115
+-1024 1:-0.63466 2:-0.251883 3:-0.318265 4:0.57969 5:-0.865465 6:-0.216675 7:-0.316367 8:0.0745275 9:-0.75046 10:-0.478455 11:-0.23325 12:-0.096932 13:-0.679829 14:-0.677913 15:-0.238767 16:-0.1385 17:-0.670302 18:-0.691282 19:0.202661 20:0.0301317 21:0.188228 22:0.34665 23:-0.572386 24:0.16632 25:0.261016 26:-0.294829 27:-0.371753 28:-0.311106 29:0.327384 30:-0.0292451 31:-0.343341 32:-0.400427 33:0.345219 34:-0.234731 35:-0.292893 36:-0.463568
+1024 1:-0.81371 2:-0.685242 3:-0.656304 4:0.322882 5:-0.986361 6:-0.75015 7:-0.629168 8:0.425911 9:-0.98488 10:-0.775371 11:-0.463427 12:0.505496 13:-0.973642 14:-0.891718 15:-0.459967 16:0.506981 17:-0.973637 18:-0.894419 19:-0.295006 20:-0.377304 21:-0.275299 22:0.177752 23:-0.770754 24:-0.45523 25:-0.272949 26:0.0501061 27:-0.707946 28:-0.529475 29:-0.200002 30:0.0894027 31:-0.69023 32:-0.713288 33:-0.220384 34:-0.0495953 35:-0.655082 36:-0.738687
+1024 1:-0.759063 2:-0.518419 3:-0.541738 4:0.340171 5:-0.944045 6:-0.603381 7:-0.544245 8:0.314429 9:-0.917076 10:-0.63998 11:-0.451232 12:0.175166 13:-0.883212 14:-0.833445 15:-0.461492 16:0.143669 17:-0.876034 18:-0.838532 19:-0.150132 20:-0.226153 21:-0.117654 22:0.0174613 23:-0.634178 24:-0.365549 25:-0.122772 26:-0.0157974 27:-0.596572 28:-0.399345 29:-0.0751625 30:-0.00538503 31:-0.530375 32:-0.591065 33:-0.0624236 34:-0.0954097 35:-0.513718 36:-0.618878
+1024 1:-0.749748 2:-0.39892 3:-0.519108 4:0.124983 5:-0.835676 6:-0.498718 7:-0.518767 8:0.115682 9:-0.807276 10:-0.539076 11:-0.472574 12:-0.162324 13:-0.719948 14:-0.756036 15:-0.473697 16:-0.273419 17:-0.695678 18:-0.780125 19:-0.13452 20:-0.179221 21:-0.0823593 22:-0.0937454 23:-0.5616 24:-0.361182 25:-0.0822472 26:-0.109851 27:-0.523935 28:-0.375656 29:-0.019111 30:-0.0990692 31:-0.458733 32:-0.566058 33:-0.0242066 34:-0.189449 35:-0.434917 36:-0.58972
+1024 1:-0.763824 2:-0.471697 3:-0.54881 4:0.25893 5:-0.906697 6:-0.553891 7:-0.558399 8:0.148219 9:-0.857488 10:-0.611172 11:-0.507634 12:0.0100442 13:-0.811446 14:-0.797215 15:-0.511835 16:-0.0652854 17:-0.795467 18:-0.811553 19:-0.138891 20:-0.203565 21:-0.0847123 22:0.00548311 23:-0.622375 24:-0.356906 25:-0.0989338 26:-0.112419 27:-0.545575 28:-0.413188 29:-0.0394934 30:-0.0158893 31:-0.506625 32:-0.571237 33:-0.0420412 34:-0.124762 35:-0.47897 36:-0.599384
+-1024 1:-0.790527 2:-0.844601 3:-0.681763 4:0.164439 5:-0.998812 6:-0.906259 7:-0.681538 8:0.298215 9:-0.998889 10:-0.90609 11:-0.570132 12:0.597284 13:-0.998318 14:-0.942121 15:-0.563702 16:0.604281 17:-0.998337 18:-0.943354 19:-0.273775 20:-0.561211 21:-0.202358 22:0.560707 23:-0.960333 24:-0.610868 25:-0.203819 26:0.634224 27:-0.958359 28:-0.617471 29:0.0649663 30:0.60643 31:-0.933225 32:-0.826903 33:0.0649663 34:0.532069 35:-0.926692 36:-0.838819
+1024 1:-0.762168 2:-0.986918 3:-0.565783 4:-0.217048 5:-0.998085 6:-0.997675 7:-0.571137 8:-0.12359 9:-0.998163 10:-0.997338 11:-0.528975 12:0.144729 13:-0.998367 14:-0.99753 15:-0.528616 16:0.162474 17:-0.99843 18:-0.997717 19:-0.538545 20:-0.953275 21:-0.602355 22:0.188516 23:-0.998938 24:-0.980328 25:-0.628131 26:0.25784 27:-0.998967 28:-0.979309 29:-0.541402 30:0.424676 31:-0.998983 32:-0.986154 33:-0.538854 34:0.419738 35:-0.999038 36:-0.986434
+1024 1:-0.780177 2:-0.962001 3:-0.673276 4:-0.120586 5:-0.998447 6:-0.985993 7:-0.682953 8:-0.0166373 9:-0.998468 10:-0.985093 11:-0.628058 12:0.273036 13:-0.998661 14:-0.98877 15:-0.626249 16:0.285355 17:-0.998656 18:-0.989172 19:-0.46486 20:-0.855686 21:-0.491769 22:0.379597 23:-0.998949 24:-0.913688 25:-0.532781 26:0.464423 27:-0.998893 28:-0.907108 29:-0.378345 30:0.666668 31:-0.998669 32:-0.949505 33:-0.357963 34:0.654784 35:-0.99864 36:-0.951611
+-1024 1:-0.774588 2:-0.870323 3:-0.671862 4:0.0840803 5:-0.997596 6:-0.92859 7:-0.646152 8:0.200612 9:-0.997809 10:-0.933892 11:-0.551841 12:0.500656 13:-0.997488 14:-0.956219 15:-0.546922 16:0.508151 17:-0.997486 18:-0.957226 19:-0.254417 20:-0.649188 21:-0.254123 22:0.5329 23:-0.977356 24:-0.705689 25:-0.194284 26:0.505109 27:-0.968767 28:-0.756815 29:0.0547751 30:0.570785 31:-0.957116 32:-0.879614 33:0.0649663 34:0.49135 35:-0.951486 36:-0.890304
+1024 1:-0.776037 2:-0.619512 3:-0.623772 4:0.354304 5:-0.967097 6:-0.6522 7:-0.573968 8:0.0729027 9:-0.931896 10:-0.806465 11:-0.528975 12:0.568533 13:-0.956743 14:-0.831549 15:-0.571329 16:-0.0461637 17:-0.882972 18:-0.901823 19:-0.218198 20:-0.385046 21:-0.242358 22:0.349223 23:-0.823902 24:-0.392188 25:-0.277717 26:-0.292138 27:-0.613065 28:-0.684236 29:-0.220384 30:0.166589 31:-0.689877 32:-0.678006 33:-0.182167 34:-0.282432 35:-0.598907 36:-0.78189
+1024 1:-0.789492 2:-0.520955 3:-0.636502 4:0.322529 5:-0.917725 6:-0.505585 7:-0.613598 8:-0.177648 9:-0.818645 10:-0.742335 11:-0.59757 12:0.289354 13:-0.866101 14:-0.765921 15:-0.58811 16:-0.372283 17:-0.753708 18:-0.874228 19:-0.250046 20:-0.317206 21:-0.261182 22:0.240679 23:-0.742832 24:-0.323254 25:-0.284868 26:-0.440133 27:-0.486255 28:-0.652936 29:-0.235671 30:-0.0233405 31:-0.565431 32:-0.641354 33:-0.200002 34:-0.291534 35:-0.513409 36:-0.717434
+-1024 1:-0.781005 2:-0.833614 3:-0.640745 4:0.192668 5:-0.999051 6:-0.902909 7:-0.665968 8:0.32914 9:-0.998987 10:-0.897923 11:-0.518305 12:0.633646 13:-0.998568 14:-0.939508 15:-0.519463 16:0.640593 17:-0.998563 18:-0.940065 19:-0.0801924 20:-0.521967 21:0.0305825 22:0.655249 23:-0.963764 24:-0.572955 25:-0.0393387 26:0.759969 27:-0.964372 28:-0.560179 29:0.416557 30:0.667812 31:-0.933295 32:-0.815479 33:0.40127 34:0.638798 35:-0.931381 36:-0.820247
+1024 1:-0.871048 2:-0.932694 3:-0.953325 4:-0.274596 5:-0.967327 6:-0.968658 7:-0.961784 8:-0.109434 9:-0.977528 10:-0.953302 11:-0.948171 12:-0.0272393 13:-0.962188 14:-0.984254 15:-0.954234 16:-0.0263414 17:-0.959457 18:-0.985636 19:-0.67093 20:-0.891976 21:-0.83059 22:0.0395268 23:-0.963926 24:-0.949061 25:-0.868893 26:0.164826 27:-0.971109 28:-0.929105 29:-0.8293 30:0.201099 31:-0.961438 32:-0.966938 33:-0.821657 34:0.214126 35:-0.964872 36:-0.966784
+-1024 1:-0.738156 2:-0.531227 3:-0.497891 4:0.471436 5:-0.967465 6:-0.605359 7:-0.501782 8:0.583293 9:-0.96231 10:-0.61778 11:-0.219531 12:0.498449 13:-0.944463 14:-0.842381 15:-0.235716 16:0.416567 17:-0.93647 18:-0.85287 19:0.0746619 20:-0.173073 21:0.00470003 22:0.263682 23:-0.659122 24:-0.126384 25:-0.048874 26:0.250221 27:-0.623515 28:-0.156549 29:0.166876 30:-0.329421 31:-0.444466 32:-0.654494 33:0.156684 34:-0.547209 35:-0.385422 36:-0.700558
+1024 1:-0.671503 2:-0.269129 3:-0.398884 4:0.149296 5:-0.776343 6:-0.371272 7:-0.399874 8:0.529634 9:-0.822023 10:-0.269144 11:-0.231726 12:-0.32602 13:-0.682795 14:-0.764198 15:-0.24487 16:-0.405759 17:-0.665104 18:-0.78234 19:0.27946 20:0.00579062 21:0.218814 22:0.108983 23:-0.48579 24:0.0260719 25:0.160898 26:0.20554 27:-0.464399 28:0.10122 29:0.393627 30:-0.528832 31:-0.225449 32:-0.551168 33:0.40127 34:-0.630397 35:-0.202362 36:-0.580689
+-1024 1:-0.77583 2:-0.821367 3:-0.636502 4:0.227685 5:-0.99911 6:-0.889761 7:-0.646152 8:0.351062 9:-0.999001 10:-0.892796 11:-0.500012 12:0.663762 13:-0.998533 14:-0.934421 15:-0.501156 16:0.670993 17:-0.998537 18:-0.934884 19:-0.0102544 20:-0.490805 21:0.0799922 22:0.740965 23:-0.965117 24:-0.508468 25:0.0393244 26:0.75004 27:-0.957201 28:-0.545375 29:0.505727 30:0.674345 31:-0.923725 32:-0.798084 33:0.480252 34:0.633525 35:-0.920205 36:-0.80409
+1024 1:-0.734017 2:-0.163917 3:-0.455461 4:0.622903 5:-0.76722 6:0.160968 7:-0.469228 8:-0.463437 9:-0.491115 10:-0.483231 11:-0.371967 12:-1 13:-0.393569 14:-0.770171 15:-0.365385 16:-0.938987 17:-0.403595 18:-0.766185 19:-0.194469 20:-0.0152061 21:-0.0517709 22:0.294525 23:-0.514367 24:0.239933 25:-0.0679444 26:-0.763846 27:-0.123978 28:-0.502772 29:0.0777053 30:-0.736943 31:-0.136512 32:-0.607454 33:0.0751575 34:-0.787446 35:-0.125612 36:-0.623704
+-445.2283108553766 1:-0.848279 2:-0.541011 3:-0.753896 4:-0.117084 5:-0.758169 6:-0.527548 7:-0.74523 8:-0.812444 9:-0.497463 10:-0.789821 11:-0.724092 12:-0.628789 13:-0.589289 14:-0.846852 15:-0.722356 16:-0.557186 17:-0.607306 18:-0.843003 19:-0.310618 20:-0.316678 21:-0.272946 22:0.297694 23:-0.724186 24:-0.208284 25:-0.353996 26:-0.208928 27:-0.548101 28:-0.535034 29:-0.189811 30:0.00747943 31:-0.637972 32:-0.696569 33:-0.20255 34:-0.0526829 35:-0.619391 36:-0.705549
+1024 1:-0.701932 2:-0.183455 3:-0.350796 4:0.954725 5:-0.884223 6:0.123554 7:-0.324859 8:-0.397477 9:-0.578432 10:-0.536508 11:-0.288128 12:-0.538226 13:-0.50473 14:-0.677061 15:-0.292161 16:-0.41385 17:-0.526386 18:-0.657151 19:-0.0283622 20:0.0841018 21:0.141169 22:0.692781 23:-0.61429 24:0.594098 25:0.156131 26:-0.837773 27:-0.0450294 28:-0.446018 29:0.238214 30:-0.693573 31:-0.0182642 32:-0.479816 33:0.245857 34:-0.606422 35:-0.0597822 36:-0.465803
+-1024 1:-0.804189 2:-0.600868 3:-0.640745 4:0.567825 5:-0.985181 6:-0.568727 7:-0.508859 8:0.418863 9:-0.976176 10:-0.764988 11:-0.35215 12:0.52621 13:-0.961554 14:-0.864588 15:-0.360809 16:0.550778 17:-0.962075 18:-0.861674 19:-0.176359 20:-0.21273 21:-0.0541238 22:0.75289 23:-0.82534 24:0.0865032 25:-0.0989338 26:-0.198949 27:-0.54907 28:-0.501567 29:-0.0394934 30:-0.310612 31:-0.471293 32:-0.68047 33:-0.0165632 34:-0.217205 35:-0.507366 36:-0.668075
+-1024 1:-0.787008 2:-0.785785 3:-0.647817 4:0.318874 5:-0.998568 6:-0.838801 7:-0.581044 8:0.388364 9:-0.998429 10:-0.882113 11:-0.477147 12:0.734549 13:-0.998012 14:-0.917631 15:-0.484375 16:0.743506 17:-0.998012 18:-0.917043 19:-0.120158 20:-0.413048 21:0.00234707 22:0.961813 23:-0.967071 24:-0.275914 25:0.141828 26:0.629718 27:-0.927466 28:-0.541559 29:0.447131 30:0.622597 31:-0.891913 32:-0.760936 33:0.444583 34:0.66973 35:-0.898334 36:-0.754372
+1024 1:-0.78825 2:-0.802837 3:-0.611043 4:0.216988 5:-0.996535 6:-0.876011 7:-0.631998 8:0.388764 9:-0.997756 10:-0.868242 11:-0.50611 12:0.668402 13:-0.996432 14:-0.921172 15:-0.516412 16:0.68333 17:-0.996589 18:-0.920133 19:-0.254417 20:-0.534651 21:-0.124712 22:0.618116 23:-0.957823 24:-0.558679 25:-0.118004 26:0.759227 27:-0.964838 28:-0.551825 29:0.128659 30:0.589844 31:-0.919336 32:-0.810233 33:0.146493 34:0.639902 35:-0.926094 36:-0.805015
+-1024 1:-0.782454 2:-0.605793 3:-0.531837 4:0.405533 5:-0.978428 6:-0.69084 7:-0.565476 8:0.643133 9:-0.981595 10:-0.648343 11:-0.333857 12:0.570401 13:-0.959773 14:-0.849548 15:-0.334875 16:0.629904 17:-0.96332 18:-0.843536 19:-0.390549 20:-0.353812 21:-0.289417 22:0.167691 23:-0.739663 24:-0.40013 25:-0.29202 26:0.40502 27:-0.793619 28:-0.338841 29:-0.245862 30:-0.175789 31:-0.56888 32:-0.717205 33:-0.235671 34:-0.0997227 35:-0.590008 36:-0.700771
+1024 1:-0.8282 2:-0.891686 3:-0.712878 4:0.0549447 5:-0.998029 6:-0.937507 7:-0.702767 8:0.171115 9:-0.998294 10:-0.94095 11:-0.638728 12:0.472617 13:-0.998117 14:-0.960225 15:-0.644555 16:0.487842 17:-0.998229 18:-0.960129 19:-0.369317 20:-0.697079 21:-0.298829 22:0.511058 23:-0.983043 24:-0.746127 25:-0.258646 26:0.565585 27:-0.982347 28:-0.769696 29:-0.0929971 30:0.611706 31:-0.967552 32:-0.885209 33:-0.0980927 34:0.60115 35:-0.966959 36:-0.886077
+-1024 1:-0.785766 2:-0.504078 3:-0.58134 4:-0.010022 5:-0.846564 6:-0.618939 7:-0.586705 8:0.431144 9:-0.907712 10:-0.523775 11:-0.55489 12:-0.171647 13:-0.759169 14:-0.804968 15:-0.556076 16:0.0778537 17:-0.80293 18:-0.769429 19:-0.373688 20:-0.299304 21:-0.289417 22:-0.0519204 23:-0.590738 24:-0.373139 25:-0.299171 26:0.206327 27:-0.679327 28:-0.312097 29:-0.20255 30:-0.276239 31:-0.474094 32:-0.678404 33:-0.205098 34:-0.205484 35:-0.493176 36:-0.661039
+1024 1:-0.791355 2:-0.500343 3:-0.585583 4:-0.034475 5:-0.837479 6:-0.619405 7:-0.590952 8:0.432261 9:-0.906105 10:-0.517163 11:-0.560986 12:-0.184324 13:-0.751924 14:-0.802411 15:-0.562176 16:0.0521702 17:-0.794059 18:-0.768648 19:-0.38368 20:-0.300827 21:-0.298829 22:-0.0717011 23:-0.581364 24:-0.380789 25:-0.306322 26:0.198961 27:-0.676364 28:-0.314414 29:-0.20255 30:-0.280968 31:-0.477607 32:-0.683587 33:-0.210193 34:-0.225267 35:-0.487766 36:-0.666102
+-1024 1:-0.811227 2:-0.898848 3:-0.727022 4:0.0308992 5:-0.997419 6:-0.940746 7:-0.690029 8:0.127381 9:-0.997599 10:-0.950246 11:-0.631107 12:0.437037 13:-0.99764 14:-0.964299 15:-0.636927 16:0.452811 17:-0.997742 18:-0.964153 19:-0.354955 20:-0.701451 21:-0.383533 22:0.581031 23:-0.989958 24:-0.721692 25:-0.306322 26:0.544593 27:-0.983827 28:-0.786094 29:-0.131212 30:0.646521 31:-0.972554 32:-0.88519 33:-0.143951 34:0.661612 35:-0.973434 36:-0.881887
+-1024 1:-0.773139 2:-0.793738 3:-0.619529 4:0.220539 5:-0.995027 6:-0.863221 7:-0.582459 8:0.319981 9:-0.995512 10:-0.886055 11:-0.466476 12:0.60102 13:-0.993646 14:-0.925628 15:-0.475222 16:0.618165 17:-0.993846 18:-0.924563 19:-0.112664 20:-0.501152 21:-0.110595 22:0.686533 23:-0.951184 24:-0.464371 25:-0.0274197 26:0.56722 27:-0.937334 28:-0.612196 29:0.26624 30:0.576351 31:-0.910594 32:-0.803289 33:0.253501 34:0.633832 35:-0.916171 36:-0.792587
+111.2852604774131 1:-0.757821 2:-0.681294 3:-0.599727 4:0.194231 5:-0.96719 6:-0.765139 7:-0.590952 8:0.121051 9:-0.951765 10:-0.826461 11:-0.521353 12:0.285183 13:-0.948078 14:-0.896287 15:-0.520988 16:0.318566 17:-0.949689 18:-0.894183 19:0.0759107 20:-0.362687 21:-0.0258884 22:0.567285 23:-0.86691 24:-0.294888 25:0.032173 26:0.262189 27:-0.819406 28:-0.556961 29:0.222927 30:0.236397 31:-0.77637 32:-0.743988 33:0.222927 34:0.283577 35:-0.783678 36:-0.73425
+-1024 1:-0.721183 2:-0.511415 3:-0.506378 4:0.639238 5:-0.978257 6:-0.527971 7:-0.380059 8:0.575379 9:-0.970969 10:-0.681714 11:-0.118924 12:0.57962 13:-0.950956 14:-0.836321 15:-0.135033 16:0.644557 17:-0.954371 18:-0.827682 19:0.077159 20:-0.187397 21:0.0446979 22:0.593259 23:-0.786811 24:-0.029656 25:0.0726977 26:-0.0627709 27:-0.621755 28:-0.478425 29:0.133754 30:-0.204659 31:-0.499461 32:-0.648725 33:0.105728 34:0.052167 35:-0.554249 36:-0.581932
+-1024 1:-0.708349 2:-0.359908 3:-0.456875 4:0.531836 5:-0.898576 6:-0.320897 7:-0.443752 8:-0.0242468 9:-0.782904 10:-0.595036 11:-0.42532 12:-0.251749 13:-0.694519 14:-0.76 15:-0.443186 16:0.0224368 17:-0.737455 18:-0.709971 19:0.0615495 20:-0.122848 21:0.0141118 22:0.476444 23:-0.700499 24:0.0547089 25:0.0631625 26:-0.303831 27:-0.47631 28:-0.481837 29:0.141398 30:-0.279147 31:-0.401816 32:-0.595294 33:0.110824 34:-0.128729 35:-0.431318 36:-0.549356
+-1024 1:-0.766101 2:-0.853893 3:-0.654889 4:0.119188 5:-0.997009 6:-0.913805 7:-0.651814 8:0.247995 9:-0.99743 10:-0.915869 11:-0.54117 12:0.551911 13:-0.997258 14:-0.946638 15:-0.545397 16:0.563115 17:-0.997298 18:-0.946695 19:-0.296255 20:-0.636747 21:-0.308241 22:0.58926 23:-0.979199 24:-0.660738 25:-0.289636 26:0.604905 27:-0.97289 28:-0.693078 29:-0.0369456 30:0.588458 31:-0.952824 32:-0.86484 33:-0.0420412 34:0.585986 35:-0.953487 36:-0.865992
+-1024 1:-0.790527 2:-0.953699 3:-0.686006 4:-0.117327 5:-0.997377 6:-0.981583 7:-0.697106 8:-0.00183805 9:-0.998107 10:-0.981018 11:-0.637204 12:0.286929 13:-0.998262 14:-0.986116 15:-0.636927 16:0.300513 17:-0.998229 18:-0.9863 19:-0.466108 20:-0.840919 21:-0.524708 22:0.414141 23:-0.998483 24:-0.88757 25:-0.52563 26:0.480796 27:-0.998448 28:-0.894855 29:-0.37325 30:0.693235 31:-0.998223 32:-0.94157 33:-0.375797 34:0.687732 35:-0.998135 36:-0.941427
+-1024 1:-0.766515 2:-0.862133 3:-0.649232 4:0.0935854 5:-0.997141 6:-0.924599 7:-0.67163 8:0.235836 9:-0.997464 10:-0.918048 11:-0.55489 12:0.530502 13:-0.997334 14:-0.950242 15:-0.557601 16:0.540371 17:-0.997338 18:-0.950525 19:-0.299377 20:-0.656991 21:-0.296476 22:0.509471 23:-0.977319 24:-0.718585 25:-0.337312 26:0.629516 27:-0.979222 28:-0.698427 29:-0.0496846 30:0.596119 31:-0.959965 32:-0.875689 33:-0.057328 34:0.590379 35:-0.959948 36:-0.876413
+1024 1:-0.694894 2:-0.472232 3:-0.485162 4:0.204381 5:-0.887012 6:-0.558745 7:-0.456491 8:0.12472 9:-0.872286 10:-0.665792 11:-0.330808 12:-0.0794241 13:-0.833479 14:-0.846526 15:-0.340977 16:-0.117122 17:-0.825035 18:-0.853876 19:0.110876 20:-0.260749 21:-0.0188295 22:0.391367 23:-0.771054 24:-0.231505 25:-0.00119786 26:0.0504319 27:-0.670335 28:-0.457928 29:0.199997 30:-0.133941 31:-0.604439 32:-0.712409 33:0.187258 34:-0.188522 35:-0.590204 36:-0.721302
+1024 1:-0.689305 2:-0.427869 3:-0.482333 4:0.124819 5:-0.841206 6:-0.516515 7:-0.446583 8:0.0478808 9:-0.820023 10:-0.613945 11:-0.367393 12:-0.223388 13:-0.760318 14:-0.813783 15:-0.369962 16:-0.22635 17:-0.758303 18:-0.819599 19:0.0596762 20:-0.258869 21:-0.0823593 22:0.32745 23:-0.740415 24:-0.227359 25:-0.048874 26:0.0165335 27:-0.651346 28:-0.457923 29:0.131206 30:-0.160532 31:-0.583673 32:-0.706359 33:0.105728 34:-0.201662 35:-0.569176 36:-0.710791
+1024 1:-0.692203 2:-0.995182 3:-0.493648 4:-0.232337 5:-0.998607 6:-0.999697 7:-0.473474 8:-0.147962 9:-0.99862 10:-0.999765 11:-0.429893 12:0.114813 13:-0.998644 14:-0.999185 15:-0.430983 16:0.132271 17:-0.998618 18:-0.999287 19:-0.597245 20:-0.995051 21:-0.769413 22:0.129339 23:-0.999717 24:-0.995432 25:-0.740168 26:0.182958 27:-0.999638 28:-0.996802 29:-0.699365 30:0.337195 31:-0.999769 32:-0.996113 33:-0.704461 34:0.332793 35:-0.999778 36:-0.996065
+1024 1:-0.845381 2:-0.933808 3:-0.756725 4:-0.00828276 5:-0.999736 6:-0.965641 7:-0.774953 8:0.117937 9:-0.999823 10:-0.961425 11:-0.705799 12:0.415358 13:-0.999726 14:-0.975052 15:-0.708626 16:0.429187 17:-0.999756 18:-0.975195 19:-0.334347 20:-0.734047 21:-0.296476 22:0.608508 23:-0.999402 24:-0.811187 25:-0.303939 26:0.724969 27:-0.999444 28:-0.796151 29:-0.0980927 30:0.960005 31:-0.998588 32:-0.892536 33:-0.0980927 34:0.953766 35:-0.998517 36:-0.892832
+1024 1:-0.768792 2:-0.588743 3:-0.503549 4:0.455089 5:-0.981916 6:-0.682269 7:-0.531506 8:0.660843 9:-0.982454 10:-0.652122 11:-0.297274 12:0.658314 13:-0.964962 14:-0.838664 15:-0.301314 16:0.63932 17:-0.962966 18:-0.841686 19:0.313177 20:-0.17189 21:0.244697 22:0.511197 23:-0.767657 24:-0.104147 25:0.265783 26:0.465659 27:-0.737162 28:-0.18136 29:0.337575 30:0.19629 31:-0.614529 32:-0.575266 33:0.350314 34:0.101319 35:-0.592362 36:-0.595303
+-1024 1:-0.618724 2:-0.240071 3:-0.301292 4:0.285731 5:-0.803807 6:-0.326679 7:-0.307874 8:0.603522 9:-0.836141 10:-0.264714 11:-0.22258 12:-0.0813791 13:-0.683933 14:-0.676265 15:-0.218936 16:-0.0493763 17:-0.688465 18:-0.676212 19:0.403713 20:0.106009 21:0.336462 22:0.0114216 23:-0.360179 24:0.134305 25:0.311076 26:0.25902 27:-0.449977 28:0.195626 29:0.470061 30:-0.34558 31:-0.175373 32:-0.408497 33:0.500632 34:-0.366773 35:-0.174606 36:-0.418624
+1024 1:-0.797358 2:-0.992861 3:-0.739752 4:-0.16332 5:-0.999492 6:-0.994992 7:-0.721168 8:-0.0725642 9:-0.999464 10:-0.995365 11:-0.689031 12:0.207757 13:-0.999488 14:-0.995833 15:-0.688794 16:0.224111 17:-0.999507 18:-0.996005 19:-0.63721 20:-0.930257 21:-0.687062 22:0.274376 23:-1 24:-0.959707 25:-0.682959 26:0.341612 27:-1 28:-0.961756 29:-0.610192 30:0.532256 31:-1 32:-0.975808 33:-0.615288 34:0.52774 35:-1 36:-0.975643
+75.7903475269011 1:-0.741675 2:-0.704856 3:-0.514864 4:0.428137 5:-0.993668 6:-0.765921 7:-0.418275 8:0.350832 9:-0.986987 10:-0.849171 11:-0.268311 12:0.583173 13:-0.981352 14:-0.899944 15:-0.273855 16:0.599419 17:-0.98162 18:-0.899127 19:0.1883 20:-0.366766 21:0.152934 22:0.792792 23:-0.910069 24:-0.218364 25:0.115606 26:0.240366 27:-0.823332 28:-0.585114 29:0.403818 30:0.0756522 31:-0.761683 32:-0.784276 33:0.393627 34:0.12452 35:-0.768864 36:-0.774869
+1024 1:-0.751818 2:-0.495167 3:-0.565783 4:0.252387 5:-0.864095 6:-0.444857 7:-0.534337 8:-0.443966 9:-0.715258 10:-0.74923 11:-0.449708 12:-0.711394 13:-0.653328 14:-0.872349 15:-0.453865 16:-0.671711 17:-0.653823 18:-0.870125 19:0.32504 20:-0.252196 21:0.216464 22:0.695066 23:-0.828802 24:-0.0545282 25:0.160898 26:0.0371097 27:-0.691688 28:-0.504824 29:0.383436 30:-0.234287 31:-0.597596 32:-0.739752 33:0.406366 34:-0.203621 35:-0.608881 36:-0.737391
+-336.0587296318182 1:-0.683509 2:-0.456787 3:-0.39747 4:0.742158 5:-0.962983 6:-0.402264 7:-0.30929 8:0.220902 9:-0.912821 10:-0.709251 11:-0.153984 12:0.220625 13:-0.886726 14:-0.827359 15:-0.160967 16:0.258929 17:-0.887607 18:-0.821877 19:0.265099 20:-0.144592 21:0.251756 22:0.914486 23:-0.831087 24:0.224212 25:0.256248 26:-0.28476 27:-0.523243 28:-0.516066 29:0.442035 30:-0.650154 31:-0.392734 32:-0.730551 33:0.424201 34:-0.479064 35:-0.426966 36:-0.690875
+398.1934108086878 1:-0.664673 2:-0.263739 3:-0.319679 4:0.984918 5:-0.920269 6:-0.00511905 7:-0.292305 8:-0.52088 9:-0.613466 10:-0.652725 11:-0.214958 12:-0.787177 13:-0.540636 14:-0.789788 15:-0.211308 16:-0.522044 17:-0.589811 18:-0.751663 19:0.274465 20:-0.0412002 21:0.298815 22:0.941888 23:-0.768716 24:0.47816 25:0.303924 26:-0.634449 27:-0.298351 28:-0.523907 29:0.498084 30:-0.924465 31:-0.196236 32:-0.696584 33:0.485348 34:-0.816622 35:-0.220138 36:-0.670318
+-1024 1:-0.653702 2:-0.263546 3:-0.31685 4:1 5:-0.922967 6:-0.0022305 7:-0.29089 8:-0.519459 9:-0.614821 10:-0.653419 11:-0.214958 12:-0.774587 13:-0.54593 14:-0.789912 15:-0.212833 16:-0.506203 17:-0.594232 18:-0.750191 19:0.240748 20:-0.0677146 21:0.258814 22:0.917788 23:-0.778776 24:0.415319 25:0.270551 26:-0.61598 27:-0.324378 28:-0.541619 29:0.464965 30:-0.885936 31:-0.233463 32:-0.708492 33:0.454774 34:-0.775773 35:-0.254089 36:-0.679469
+-1024 1:-0.793218 2:-0.953582 3:-0.717122 4:-0.0872791 5:-0.998545 6:-0.979173 7:-0.690029 8:0.0165493 9:-0.998788 10:-0.980753 11:-0.643302 12:0.305888 13:-0.998731 14:-0.985542 15:-0.646081 16:0.322514 17:-0.998775 18:-0.985576 19:-0.416152 20:-0.821322 21:-0.458827 22:0.450443 23:-0.998727 24:-0.877054 25:-0.432662 26:0.525899 27:-0.999106 28:-0.888534 29:-0.261149 30:0.73472 31:-0.998351 32:-0.937589 33:-0.273888 34:0.73769 35:-0.998485 36:-0.9366
+1024 1:-0.679162 2:-0.444398 3:-0.45829 4:-0.0223488 5:-0.823307 6:-0.594749 7:-0.443752 8:0.317882 9:-0.87416 10:-0.54892 11:-0.356723 12:-0.288302 13:-0.754602 14:-0.827551 15:-0.368436 16:0.0520249 17:-0.8057 18:-0.779877 19:0.0428178 20:-0.219551 21:-0.115301 22:0.0988211 23:-0.612921 24:-0.230732 25:-0.0393387 26:0.231651 27:-0.669367 28:-0.271736 29:0.164328 30:-0.279131 31:-0.517757 32:-0.695855 33:0.110824 34:-0.107526 35:-0.541114 36:-0.64419
+-1024 1:-0.672952 2:-0.475256 3:-0.454046 4:-0.0746968 5:-0.833935 6:-0.650841 7:-0.429598 8:0.491492 9:-0.926715 10:-0.573089 11:-0.309469 12:-0.208591 13:-0.807875 14:-0.854838 15:-0.318095 16:0.124214 17:-0.853839 18:-0.815852 19:0.112749 20:-0.243459 21:-0.0753004 22:0.132796 23:-0.653083 24:-0.26993 25:-0.00358167 26:0.289828 27:-0.718205 28:-0.312337 29:0.205093 30:-0.230557 31:-0.561042 32:-0.711996 33:0.159232 34:-0.0805118 35:-0.580722 36:-0.668599
+-1024 1:-0.656393 2:-0.432974 3:-0.43283 4:-0.0352656 5:-0.817073 6:-0.59419 7:-0.418275 8:0.321152 9:-0.872634 10:-0.546784 11:-0.324711 12:-0.296452 13:-0.751671 14:-0.825726 15:-0.3364 16:0.0137474 17:-0.798091 18:-0.78341 19:0.100262 20:-0.22722 21:-0.0894182 22:0.125014 23:-0.633007 24:-0.238864 25:-0.00834928 26:0.190347 27:-0.67206 28:-0.322134 29:0.199997 30:-0.260789 31:-0.533038 32:-0.700013 33:0.156684 34:-0.139983 35:-0.545437 36:-0.661215
+-1024 1:-0.654116 2:-0.427508 3:-0.425758 4:-0.031708 5:-0.815786 6:-0.589737 7:-0.414029 8:0.325309 9:-0.871363 10:-0.541614 11:-0.318615 12:-0.297391 13:-0.749666 14:-0.823826 15:-0.330298 16:0.0198052 17:-0.796579 18:-0.779476 19:0.112749 20:-0.221669 21:-0.0823593 22:0.112812 23:-0.624043 24:-0.235953 25:0.00356975 26:0.186015 27:-0.668861 28:-0.320565 29:0.215284 30:-0.26327 31:-0.528684 32:-0.696648 33:0.16178 34:-0.128269 35:-0.542359 36:-0.652997
+1024 1:-0.793425 2:-0.980647 3:-0.712878 4:-0.18593 5:-0.998015 6:-0.993898 7:-0.835814 8:-0.0550299 9:-0.998348 10:-0.987708 11:-0.713421 12:0.194611 13:-0.998382 14:-0.994057 15:-0.714728 16:0.210449 17:-0.998383 18:-0.99423 19:-0.675925 20:-0.950961 21:-0.764707 22:0.146366 23:-0.996564 24:-0.97825 25:-0.966627 26:0.292334 27:-0.999398 28:-0.951196 29:-0.768153 30:0.361127 31:-0.996288 32:-0.98425 33:-0.770701 34:0.348215 35:-0.995883 36:-0.984592
+1024 1:-0.771276 2:-0.997565 3:-0.640745 4:-0.209964 5:-0.998857 6:-0.998449 7:-0.712675 8:-0.105703 9:-0.999001 10:-0.997291 11:-0.626534 12:0.157554 13:-0.999098 14:-0.998006 15:-0.633876 16:0.173641 17:-0.999071 18:-0.998119 19:-0.640957 20:-0.971937 21:-0.710589 22:0.16391 23:-0.999478 24:-0.987627 25:-0.845055 26:0.249142 27:-0.999579 28:-0.979498 29:-0.712102 30:0.398762 31:-0.999564 32:-0.989384 33:-0.714649 34:0.394376 35:-0.999582 36:-0.989381
+-1024 1:-0.736708 2:-0.393307 3:-0.512036 4:0.297662 5:-0.848117 6:-0.378583 7:-0.535752 8:-0.0573859 9:-0.743284 10:-0.548025 11:-0.434466 12:-0.356388 13:-0.68361 14:-0.786946 15:-0.450814 16:-0.366268 17:-0.665095 18:-0.782716 19:-0.0495941 20:-0.181767 21:-0.0682416 22:0.182626 23:-0.599798 24:-0.0998354 25:-0.0703282 26:-0.0130161 27:-0.537531 28:-0.294933 29:0.118467 30:-0.487709 31:-0.382458 32:-0.675922 33:0.113372 34:-0.404437 35:-0.396257 36:-0.650301
+-1024 1:-0.73857 2:-0.405513 3:-0.517693 4:0.286655 5:-0.85266 6:-0.398444 7:-0.539998 8:-0.00575111 9:-0.768912 10:-0.554154 11:-0.44361 12:-0.307035 13:-0.704126 14:-0.7901 15:-0.456916 16:-0.305622 17:-0.691092 18:-0.785997 19:-0.0639566 20:-0.188787 21:-0.0753004 22:0.192873 23:-0.605453 24:-0.0979179 25:-0.084631 26:-0.00894239 27:-0.544284 28:-0.302499 29:0.113372 30:-0.470745 31:-0.395471 32:-0.679163 33:0.0955373 34:-0.400044 35:-0.401094 36:-0.65339
+-1024 1:-0.73174 2:-0.523161 3:-0.497891 4:0.494083 5:-0.957923 6:-0.548793 7:-0.455075 8:0.538218 9:-0.959185 10:-0.644568 11:-0.271359 12:0.425098 13:-0.926964 14:-0.829627 15:-0.276906 16:0.44683 17:-0.927425 18:-0.827464 19:-0.0608345 20:-0.267078 21:-0.0753004 22:0.286818 23:-0.720056 24:-0.229537 25:-0.101318 26:0.385821 27:-0.754419 28:-0.284079 29:0.0980851 30:-0.216678 31:-0.568046 32:-0.716615 33:0.0649663 34:-0.161757 35:-0.568279 36:-0.694907
+-1024 1:-0.833168 2:-0.931481 3:-0.794914 4:-0.0248848 5:-0.998262 6:-0.957612 7:-0.787692 8:0.081047 9:-0.998216 10:-0.959015 11:-0.739335 12:0.387128 13:-0.998495 14:-0.971541 15:-0.736086 16:0.396198 17:-0.998427 18:-0.972248 19:-0.583507 20:-0.788328 21:-0.578826 22:0.476018 23:-0.99489 24:-0.805976 25:-0.604293 26:0.560736 27:-0.995555 28:-0.808693 29:-0.462423 30:0.706301 31:-0.989625 32:-0.898871 33:-0.459875 34:0.6805 35:-0.988518 36:-0.902107
+-1024 1:-0.795288 2:-0.86471 3:-0.704392 4:0.118957 5:-0.997807 6:-0.913224 7:-0.698521 8:0.23696 9:-0.997669 10:-0.915869 11:-0.62196 12:0.554778 13:-0.99764 14:-0.943664 15:-0.61557 16:0.557946 17:-0.997553 18:-0.945314 19:-0.386802 20:-0.671728 21:-0.334121 22:0.571199 23:-0.984316 24:-0.701182 25:-0.349229 26:0.643057 27:-0.983135 28:-0.707468 29:-0.146498 30:0.739352 31:-0.968563 32:-0.847948 33:-0.159237 34:0.646101 35:-0.961006 36:-0.859174
+-1024 1:-0.728221 2:-0.718844 3:-0.51345 4:0.223914 5:-0.985847 6:-0.830349 7:-0.487628 8:0.335051 9:-0.984872 10:-0.840375 11:-0.32776 12:0.47969 13:-0.976826 14:-0.909784 15:-0.337926 16:0.444899 17:-0.974284 18:-0.914168 19:-0.0627078 20:-0.521786 21:-0.148242 22:0.349318 23:-0.904077 24:-0.613318 25:-0.144226 26:0.34273 27:-0.891381 28:-0.649733 29:-0.100641 30:0.370189 31:-0.841746 32:-0.771579 33:-0.113377 34:0.225283 35:-0.81933 36:-0.797549
+-1024 1:-0.737743 2:-0.702716 3:-0.54881 4:0.0941084 5:-0.968057 6:-0.829132 7:-0.537168 8:0.211065 9:-0.968287 10:-0.833365 11:-0.527451 12:0.262801 13:-0.94485 14:-0.896666 15:-0.533193 16:0.189756 17:-0.937021 18:-0.907064 19:-0.0683277 20:-0.510141 21:-0.18824 22:0.252439 23:-0.873023 24:-0.615983 25:-0.184748 26:0.220469 27:-0.85328 28:-0.657618 29:-0.118473 30:0.307269 31:-0.815667 32:-0.762866 33:-0.12102 34:0.150374 35:-0.790197 36:-0.792565
+-1024 1:-0.740227 2:-0.704782 3:-0.554468 4:0.092509 5:-0.968234 6:-0.829871 7:-0.542829 8:0.204918 9:-0.967976 10:-0.83464 11:-0.538121 12:0.265129 13:-0.944803 14:-0.895784 15:-0.542346 16:0.1804 17:-0.936075 18:-0.907973 19:-0.079568 20:-0.515253 21:-0.197652 22:0.249861 23:-0.873454 24:-0.61882 25:-0.194284 26:0.230685 27:-0.856064 28:-0.655237 29:-0.131212 30:0.314986 31:-0.818557 32:-0.76331 33:-0.133759 34:0.147673 35:-0.79113 36:-0.794684
+1024 1:-0.795288 2:-0.865152 3:-0.678934 4:0.120106 5:-0.998548 6:-0.921535 7:-0.698521 8:0.250764 9:-0.998639 10:-0.91839 11:-0.585375 12:0.552085 13:-0.998379 14:-0.950141 15:-0.589636 16:0.564269 17:-0.998438 18:-0.950164 19:-0.290635 20:-0.621689 21:-0.197652 22:0.570406 23:-0.974367 24:-0.667831 25:-0.227657 26:0.651868 27:-0.976254 28:-0.681213 29:0.0547751 30:0.594419 31:-0.951316 32:-0.8625 33:0.0471317 34:0.598739 35:-0.951705 36:-0.861177
+1024 1:-0.735673 2:-0.371499 3:-0.486576 4:0.339715 5:-0.85403 6:-0.361923 7:-0.487628 8:0.170993 9:-0.823526 10:-0.535203 11:-0.405503 12:-0.138404 13:-0.746973 14:-0.773321 15:-0.417253 16:-0.113303 17:-0.744756 18:-0.76959 19:-0.0933061 20:-0.109491 21:-0.047065 22:0.282568 23:-0.602027 24:0.0284462 25:-0.0512578 26:-0.229358 27:-0.408094 28:-0.322194 29:0.0980851 30:-0.411715 31:-0.344425 32:-0.611911 33:0.0955373 34:-0.417416 35:-0.341822 36:-0.612479
+-1024 1:-0.733603 2:-0.414076 3:-0.459704 4:0.484389 5:-0.916127 6:-0.419005 7:-0.425352 8:0.441814 9:-0.915328 10:-0.573679 11:-0.280506 12:0.163679 13:-0.856872 14:-0.80255 15:-0.281482 16:0.196394 17:-0.859822 18:-0.80118 19:-0.0614589 20:-0.108981 21:0.00940594 22:0.408197 23:-0.662346 24:0.056772 25:0.0154888 26:-0.168506 27:-0.461433 28:-0.332853 29:0.13885 30:-0.373347 31:-0.368092 32:-0.611055 33:0.131206 34:-0.408258 35:-0.351197 36:-0.613643
+1024 1:-0.73857 2:-0.51211 3:-0.479504 4:0.604404 5:-0.974491 6:-0.544324 7:-0.422521 8:0.620765 9:-0.971842 10:-0.65169 11:-0.102156 12:0.581323 13:-0.95162 14:-0.837748 15:-0.106048 16:0.587825 17:-0.951538 18:-0.83837 19:-0.0339827 20:-0.172545 21:-0.00471179 22:0.429171 23:-0.715769 24:-0.0458392 25:-0.0202683 26:-0.0220905 27:-0.563867 28:-0.347094 29:0.108276 30:-0.303306 31:-0.454627 32:-0.654565 33:0.105728 34:-0.32191 35:-0.450004 36:-0.658797
+1024 1:-0.639628 2:-0.279116 3:-0.32958 4:0.468207 5:-0.849119 6:-0.270989 7:-0.323444 8:0.293009 9:-0.827165 10:-0.476109 11:-0.173801 12:-0.118246 13:-0.741382 14:-0.753603 15:-0.186901 16:-0.0597744 17:-0.746915 18:-0.746722 19:0.245743 20:-0.00403813 21:0.202346 22:0.36584 23:-0.584543 24:0.164162 25:0.225259 26:-0.206361 27:-0.396209 28:-0.259099 29:0.388531 30:-0.395338 31:-0.300255 32:-0.549861 33:0.383436 34:-0.411515 35:-0.292986 36:-0.551861
diff --git a/src/backend/app/algorithms/evaluate/libsvm/python/brisquequality.py b/src/backend/app/algorithms/evaluate/libsvm/python/brisquequality.py
new file mode 100644
index 0000000..f3f658b
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/python/brisquequality.py
@@ -0,0 +1,220 @@
+import cv2
+import numpy as np
+import math as m
+import sys
+# the gamma function, imported as tgamma (matching the C tgamma name)
+from scipy.special import gamma as tgamma
+import os
+
+# import svm functions (from the libsvm library)
+# Python 2.x: the bindings can be installed system-wide
+#   (e.g., sudo apt-get install python-libsvm)
+# Python 3.x: make sure this file sits in the libsvm/python folder
+# (the imports themselves are identical for both versions)
+import svm
+import svmutil
+from svm import *
+from svmutil import *
+
+# AGGD fit model, takes input as the MSCN Image / Pair-wise Product
+def AGGDfit(structdis):
+    # count the positive and negative pixels of the MSCN image / pairwise product
+    poscount = len(structdis[structdis > 0]) # number of positive pixels
+    negcount = len(structdis[structdis < 0]) # number of negative pixels
+
+ # calculate squared sum of positive pixels and negative pixels
+ possqsum = np.sum(np.power(structdis[structdis > 0], 2))
+ negsqsum = np.sum(np.power(structdis[structdis < 0], 2))
+
+ # absolute squared sum
+ abssum = np.sum(structdis[structdis > 0]) + np.sum(-1 * structdis[structdis < 0])
+
+ # calculate left sigma variance and right sigma variance
+ lsigma_best = np.sqrt((negsqsum/negcount))
+ rsigma_best = np.sqrt((possqsum/poscount))
+
+ gammahat = lsigma_best/rsigma_best
+
+ # total number of pixels - totalcount
+ totalcount = structdis.shape[1] * structdis.shape[0]
+
+ rhat = m.pow(abssum/totalcount, 2)/((negsqsum + possqsum)/totalcount)
+ rhatnorm = rhat * (m.pow(gammahat, 3) + 1) * (gammahat + 1)/(m.pow(m.pow(gammahat, 2) + 1, 2))
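+    # Editor's sketch of the standard AGGD moment-matching step: for a
+    # symmetric GGD, rhat = (E|X|)^2 / E[X^2] depends on the shape parameter
+    # only through r(gamma) = Gamma(2/gamma)^2 / (Gamma(1/gamma) * Gamma(3/gamma));
+    # rhatnorm corrects rhat for the left/right asymmetry, and func() below
+    # inverts r(.) by a grid search over gamma in (0.2, 10).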
+
+ prevgamma = 0
+ prevdiff = 1e10
+ sampling = 0.001
+ gam = 0.2
+
+ # vectorized function call for best fitting parameters
+ vectfunc = np.vectorize(func, otypes = [np.float32], cache = False)
+
+ # calculate best fit params
+ gamma_best = vectfunc(gam, prevgamma, prevdiff, sampling, rhatnorm)
+
+ return [lsigma_best, rsigma_best, gamma_best]
+
+def func(gam, prevgamma, prevdiff, sampling, rhatnorm):
+ while(gam < 10):
+ r_gam = tgamma(2/gam) * tgamma(2/gam) / (tgamma(1/gam) * tgamma(3/gam))
+ diff = abs(r_gam - rhatnorm)
+ if(diff > prevdiff): break
+ prevdiff = diff
+ prevgamma = gam
+ gam += sampling
+ gamma_best = prevgamma
+ return gamma_best
+
+def compute_features(img):
+ scalenum = 2
+ feat = []
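+    # layout of feat (36 values in total): for each of the 2 scales,
+    # 2 AGGD parameters of the MSCN image plus 4 parameters for each of
+    # the 4 pairwise-product orientations (H, V, D1, D2) = 18 per scale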
+ # make a copy of the image
+ im_original = img.copy()
+
+ # scale the images twice
+ for itr_scale in range(scalenum):
+ im = im_original.copy()
+ # normalize the image
+ im = im / 255.0
+
+ # calculating MSCN coefficients
+ mu = cv2.GaussianBlur(im, (7, 7), 1.166)
+ mu_sq = mu * mu
+
+ sigma = cv2.GaussianBlur(im*im, (7, 7), 1.166)
+ sigma = (sigma - mu_sq) ** 0.5
+
+ # structdis is the MSCN image
+ structdis = im - mu
+ structdis /= (sigma + 1.0/255)
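+        # MSCN = (I - mu) / (sigma + C): local mean subtraction followed by
+        # divisive normalization; the constant C = 1/255 keeps the division
+        # stable in flat regions where sigma is near zero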
+
+ # calculate best fitted parameters from MSCN image
+ best_fit_params = AGGDfit(structdis)
+ # unwrap the best fit parameters
+ lsigma_best = best_fit_params[0]
+ rsigma_best = best_fit_params[1]
+ gamma_best = best_fit_params[2]
+
+ # append the best fit parameters for MSCN image
+ feat.append(gamma_best)
+ feat.append((lsigma_best*lsigma_best + rsigma_best*rsigma_best)/2)
+
+ # shifting indices for creating pair-wise products
+ shifts = [[0,1], [1,0], [1,1], [-1,1]] # H V D1 D2
+
+ for itr_shift in range(1, len(shifts) + 1):
+ OrigArr = structdis
+ reqshift = shifts[itr_shift-1] # shifting index
+
+ # create transformation matrix for warpAffine function
+ M = np.float32([[1, 0, reqshift[1]], [0, 1, reqshift[0]]])
+ ShiftArr = cv2.warpAffine(OrigArr, M, (structdis.shape[1], structdis.shape[0]))
+
+ Shifted_new_structdis = ShiftArr
+ Shifted_new_structdis = Shifted_new_structdis * structdis
+ # shifted_new_structdis is the pairwise product
+ # best fit the pairwise product
+ best_fit_params = AGGDfit(Shifted_new_structdis)
+ lsigma_best = best_fit_params[0]
+ rsigma_best = best_fit_params[1]
+ gamma_best = best_fit_params[2]
+
+ constant = m.pow(tgamma(1/gamma_best), 0.5)/m.pow(tgamma(3/gamma_best), 0.5)
+ meanparam = (rsigma_best - lsigma_best) * (tgamma(2/gamma_best)/tgamma(1/gamma_best)) * constant
+
+ # append the best fit calculated parameters
+ feat.append(gamma_best) # gamma best
+ feat.append(meanparam) # mean shape
+ feat.append(m.pow(lsigma_best, 2)) # left variance square
+ feat.append(m.pow(rsigma_best, 2)) # right variance square
+
+ # resize the image on next iteration
+ im_original = cv2.resize(im_original, (0,0), fx=0.5, fy=0.5, interpolation=cv2.INTER_CUBIC)
+ return feat
+
+# function to calculate BRISQUE quality score
+# takes input of the image path
+def test_measure_BRISQUE(imgPath):
+ # read image from given path
+ dis = cv2.imread(imgPath, 1)
+ if(dis is None):
+ print("Wrong image path given")
+ print("Exiting...")
+ sys.exit(0)
+ # convert to gray scale
+ dis = cv2.cvtColor(dis, cv2.COLOR_BGR2GRAY)
+
+ # compute feature vectors of the image
+ features = compute_features(dis)
+
+    # rescale the BRISQUE feature vector to the range [-1, 1]
+    # (x[0] is a dummy entry; only x[1:] is passed to gen_svm_nodearray below)
+    x = [0]
+
+ # pre loaded lists from C++ Module to rescale brisquefeatures vector to [-1, 1]
+ min_= [0.336999 ,0.019667 ,0.230000 ,-0.125959 ,0.000167 ,0.000616 ,0.231000 ,-0.125873 ,0.000165 ,0.000600 ,0.241000 ,-0.128814 ,0.000179 ,0.000386 ,0.243000 ,-0.133080 ,0.000182 ,0.000421 ,0.436998 ,0.016929 ,0.247000 ,-0.200231 ,0.000104 ,0.000834 ,0.257000 ,-0.200017 ,0.000112 ,0.000876 ,0.257000 ,-0.155072 ,0.000112 ,0.000356 ,0.258000 ,-0.154374 ,0.000117 ,0.000351]
+
+ max_= [9.999411, 0.807472, 1.644021, 0.202917, 0.712384, 0.468672, 1.644021, 0.169548, 0.713132, 0.467896, 1.553016, 0.101368, 0.687324, 0.533087, 1.554016, 0.101000, 0.689177, 0.533133, 3.639918, 0.800955, 1.096995, 0.175286, 0.755547, 0.399270, 1.095995, 0.155928, 0.751488, 0.402398, 1.041992, 0.093209, 0.623516, 0.532925, 1.042992, 0.093714, 0.621958, 0.534484]
+
+ # append the rescaled vector to x
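+    # linear rescale of feature i: x' = -1 + 2*(x - min)/(max - min),
+    # the same mapping svm-scale applies for the range [-1, 1]; e.g. a value
+    # halfway between min and max maps to 0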
+    for i in range(0, 36):
+        lo = min_[i]
+        hi = max_[i]
+        x.append(-1 + (2.0/(hi - lo) * (features[i] - lo)))
+
+ # load model
+ model = svmutil.svm_load_model("allmodel")
+
+ # create svm node array from python list
+ x, idx = gen_svm_nodearray(x[1:], isKernel=(model.param.kernel_type == PRECOMPUTED))
+ x[36].index = -1 # set last index to -1 to indicate the end.
+
+ # get important parameters from model
+ svm_type = model.get_svm_type()
+ is_prob_model = model.is_probability_model()
+ nr_class = model.get_nr_class()
+
+    if svm_type in (ONE_CLASS, EPSILON_SVR, NU_SVR):
+        # one decision value for one-class SVM and regression models
+        # (here svm_type is EPSILON_SVR, as this is a regression problem)
+        nr_classifier = 1
+ dec_values = (c_double * nr_classifier)()
+
+ # calculate the quality score of the image using the model and svm_node_array
+ qualityscore = svmutil.libsvm.svm_predict_probability(model, x, dec_values)
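+    # note: for a regression model without probability information, libsvm's
+    # svm_predict_probability falls back to plain svm_predict, so the returned
+    # value is the raw regression output, i.e. the BRISQUE score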
+
+ return qualityscore
+
+# exit if the expected input arguments are not given
+if(len(sys.argv) != 3):
+    print("Usage: python brisquequality.py <image_directory> <log_directory>")
+    print("Exiting...")
+    sys.exit(0)
+
+# calculate the average quality score over all .png images in the directory
+qualityscore = 0.0
+cnt = 0
+data = sys.argv[1]
+for fname in os.listdir(data):
+    print(fname)
+    if fname.endswith('.png'):
+        qualityscore += test_measure_BRISQUE(os.path.join(data, fname))
+        cnt += 1
+print("Averaged score of the given images: ", qualityscore/cnt)
+
+log_dir = sys.argv[2]
+
+with open(os.path.join("../..", os.path.join(log_dir, 'log.txt')), 'a') as f:
+ f.write(f'BRISQUE = {qualityscore/cnt}')
diff --git a/src/backend/app/algorithms/evaluate/libsvm/python/svm.py b/src/backend/app/algorithms/evaluate/libsvm/python/svm.py
new file mode 100644
index 0000000..577160d
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/python/svm.py
@@ -0,0 +1,330 @@
+#!/usr/bin/env python
+
+from ctypes import *
+from ctypes.util import find_library
+from os import path
+import sys
+
+if sys.version_info[0] >= 3:
+ xrange = range
+
+__all__ = ['libsvm', 'svm_problem', 'svm_parameter',
+ 'toPyModel', 'gen_svm_nodearray', 'print_null', 'svm_node', 'C_SVC',
+ 'EPSILON_SVR', 'LINEAR', 'NU_SVC', 'NU_SVR', 'ONE_CLASS',
+ 'POLY', 'PRECOMPUTED', 'PRINT_STRING_FUN', 'RBF',
+ 'SIGMOID', 'c_double', 'svm_model']
+
+try:
+ dirname = path.dirname(path.abspath(__file__))
+ if sys.platform == 'win32':
+ libsvm = CDLL(path.join(dirname, r'..\windows\libsvm.dll'))
+ else:
+ libsvm = CDLL(path.join(dirname, '../libsvm.so.2'))
+except:
+# For unix the prefix 'lib' is not considered.
+ if find_library('svm'):
+ libsvm = CDLL(find_library('svm'))
+ elif find_library('libsvm'):
+ libsvm = CDLL(find_library('libsvm'))
+ else:
+ raise Exception('LIBSVM library not found.')
+
+C_SVC = 0
+NU_SVC = 1
+ONE_CLASS = 2
+EPSILON_SVR = 3
+NU_SVR = 4
+
+LINEAR = 0
+POLY = 1
+RBF = 2
+SIGMOID = 3
+PRECOMPUTED = 4
+
+PRINT_STRING_FUN = CFUNCTYPE(None, c_char_p)
+def print_null(s):
+ return
+
+def genFields(names, types):
+ return list(zip(names, types))
+
+def fillprototype(f, restype, argtypes):
+ f.restype = restype
+ f.argtypes = argtypes
+
+class svm_node(Structure):
+ _names = ["index", "value"]
+ _types = [c_int, c_double]
+ _fields_ = genFields(_names, _types)
+
+ def __str__(self):
+ return '%d:%g' % (self.index, self.value)
+
+def gen_svm_nodearray(xi, feature_max=None, isKernel=None):
+ if isinstance(xi, dict):
+ index_range = xi.keys()
+ elif isinstance(xi, (list, tuple)):
+ if not isKernel:
+ xi = [0] + xi # idx should start from 1
+ index_range = range(len(xi))
+ else:
+ raise TypeError('xi should be a dictionary, list or tuple')
+
+ if feature_max:
+ assert(isinstance(feature_max, int))
+ index_range = filter(lambda j: j <= feature_max, index_range)
+ if not isKernel:
+ index_range = filter(lambda j:xi[j] != 0, index_range)
+
+ index_range = sorted(index_range)
+ ret = (svm_node * (len(index_range)+1))()
+ ret[-1].index = -1
+ for idx, j in enumerate(index_range):
+ ret[idx].index = j
+ ret[idx].value = xi[j]
+ max_idx = 0
+ if index_range:
+ max_idx = index_range[-1]
+ return ret, max_idx
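+# Illustrative example (editor's note): gen_svm_nodearray({1: 0.5, 3: -0.25})
+# returns a ctypes svm_node array holding (1, 0.5), (3, -0.25) and the
+# index = -1 sentinel, together with max_idx = 3.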
+
+class svm_problem(Structure):
+ _names = ["l", "y", "x"]
+ _types = [c_int, POINTER(c_double), POINTER(POINTER(svm_node))]
+ _fields_ = genFields(_names, _types)
+
+ def __init__(self, y, x, isKernel=None):
+ if len(y) != len(x):
+ raise ValueError("len(y) != len(x)")
+ self.l = l = len(y)
+
+ max_idx = 0
+ x_space = self.x_space = []
+ for i, xi in enumerate(x):
+ tmp_xi, tmp_idx = gen_svm_nodearray(xi,isKernel=isKernel)
+ x_space += [tmp_xi]
+ max_idx = max(max_idx, tmp_idx)
+ self.n = max_idx
+
+ self.y = (c_double * l)()
+ for i, yi in enumerate(y): self.y[i] = yi
+
+ self.x = (POINTER(svm_node) * l)()
+ for i, xi in enumerate(self.x_space): self.x[i] = xi
+
+class svm_parameter(Structure):
+ _names = ["svm_type", "kernel_type", "degree", "gamma", "coef0",
+ "cache_size", "eps", "C", "nr_weight", "weight_label", "weight",
+ "nu", "p", "shrinking", "probability"]
+ _types = [c_int, c_int, c_int, c_double, c_double,
+ c_double, c_double, c_double, c_int, POINTER(c_int), POINTER(c_double),
+ c_double, c_double, c_int, c_int]
+ _fields_ = genFields(_names, _types)
+
+ def __init__(self, options = None):
+ if options == None:
+ options = ''
+ self.parse_options(options)
+
+ def __str__(self):
+ s = ''
+ attrs = svm_parameter._names + list(self.__dict__.keys())
+ values = map(lambda attr: getattr(self, attr), attrs)
+ for attr, val in zip(attrs, values):
+ s += (' %s: %s\n' % (attr, val))
+ s = s.strip()
+
+ return s
+
+ def set_to_default_values(self):
+		self.svm_type = C_SVC
+ self.kernel_type = RBF
+ self.degree = 3
+ self.gamma = 0
+ self.coef0 = 0
+ self.nu = 0.5
+ self.cache_size = 100
+ self.C = 1
+ self.eps = 0.001
+ self.p = 0.1
+ self.shrinking = 1
+ self.probability = 0
+ self.nr_weight = 0
+ self.weight_label = None
+ self.weight = None
+ self.cross_validation = False
+ self.nr_fold = 0
+ self.print_func = cast(None, PRINT_STRING_FUN)
+
+ def parse_options(self, options):
+ if isinstance(options, list):
+ argv = options
+ elif isinstance(options, str):
+ argv = options.split()
+ else:
+ raise TypeError("arg 1 should be a list or a str.")
+ self.set_to_default_values()
+ self.print_func = cast(None, PRINT_STRING_FUN)
+ weight_label = []
+ weight = []
+
+ i = 0
+ while i < len(argv):
+ if argv[i] == "-s":
+ i = i + 1
+ self.svm_type = int(argv[i])
+ elif argv[i] == "-t":
+ i = i + 1
+ self.kernel_type = int(argv[i])
+ elif argv[i] == "-d":
+ i = i + 1
+ self.degree = int(argv[i])
+ elif argv[i] == "-g":
+ i = i + 1
+ self.gamma = float(argv[i])
+ elif argv[i] == "-r":
+ i = i + 1
+ self.coef0 = float(argv[i])
+ elif argv[i] == "-n":
+ i = i + 1
+ self.nu = float(argv[i])
+ elif argv[i] == "-m":
+ i = i + 1
+ self.cache_size = float(argv[i])
+ elif argv[i] == "-c":
+ i = i + 1
+ self.C = float(argv[i])
+ elif argv[i] == "-e":
+ i = i + 1
+ self.eps = float(argv[i])
+ elif argv[i] == "-p":
+ i = i + 1
+ self.p = float(argv[i])
+ elif argv[i] == "-h":
+ i = i + 1
+ self.shrinking = int(argv[i])
+ elif argv[i] == "-b":
+ i = i + 1
+ self.probability = int(argv[i])
+ elif argv[i] == "-q":
+ self.print_func = PRINT_STRING_FUN(print_null)
+ elif argv[i] == "-v":
+ i = i + 1
+ self.cross_validation = 1
+ self.nr_fold = int(argv[i])
+ if self.nr_fold < 2:
+ raise ValueError("n-fold cross validation: n must >= 2")
+ elif argv[i].startswith("-w"):
+ i = i + 1
+ self.nr_weight += 1
+ weight_label += [int(argv[i-1][2:])]
+ weight += [float(argv[i])]
+ else:
+ raise ValueError("Wrong options")
+ i += 1
+
+ libsvm.svm_set_print_string_function(self.print_func)
+ self.weight_label = (c_int*self.nr_weight)()
+ self.weight = (c_double*self.nr_weight)()
+ for i in range(self.nr_weight):
+ self.weight[i] = weight[i]
+ self.weight_label[i] = weight_label[i]
+
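+# Illustrative example (editor's note): svm_parameter('-s 3 -t 2 -c 100 -q')
+# configures epsilon-SVR (-s 3) with an RBF kernel (-t 2), C = 100 and
+# silenced training output, mirroring the svm-train command line flags.
+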
+class svm_model(Structure):
+ _names = ['param', 'nr_class', 'l', 'SV', 'sv_coef', 'rho',
+ 'probA', 'probB', 'sv_indices', 'label', 'nSV', 'free_sv']
+ _types = [svm_parameter, c_int, c_int, POINTER(POINTER(svm_node)),
+ POINTER(POINTER(c_double)), POINTER(c_double),
+ POINTER(c_double), POINTER(c_double), POINTER(c_int),
+ POINTER(c_int), POINTER(c_int), c_int]
+ _fields_ = genFields(_names, _types)
+
+ def __init__(self):
+ self.__createfrom__ = 'python'
+
+ def __del__(self):
+ # free memory created by C to avoid memory leak
+ if hasattr(self, '__createfrom__') and self.__createfrom__ == 'C':
+ libsvm.svm_free_and_destroy_model(pointer(self))
+
+ def get_svm_type(self):
+ return libsvm.svm_get_svm_type(self)
+
+ def get_nr_class(self):
+ return libsvm.svm_get_nr_class(self)
+
+ def get_svr_probability(self):
+ return libsvm.svm_get_svr_probability(self)
+
+ def get_labels(self):
+ nr_class = self.get_nr_class()
+ labels = (c_int * nr_class)()
+ libsvm.svm_get_labels(self, labels)
+ return labels[:nr_class]
+
+ def get_sv_indices(self):
+ total_sv = self.get_nr_sv()
+ sv_indices = (c_int * total_sv)()
+ libsvm.svm_get_sv_indices(self, sv_indices)
+ return sv_indices[:total_sv]
+
+ def get_nr_sv(self):
+ return libsvm.svm_get_nr_sv(self)
+
+ def is_probability_model(self):
+ return (libsvm.svm_check_probability_model(self) == 1)
+
+ def get_sv_coef(self):
+ return [tuple(self.sv_coef[j][i] for j in xrange(self.nr_class - 1))
+ for i in xrange(self.l)]
+
+ def get_SV(self):
+ result = []
+ for sparse_sv in self.SV[:self.l]:
+ row = dict()
+
+ i = 0
+ while True:
+ row[sparse_sv[i].index] = sparse_sv[i].value
+ if sparse_sv[i].index == -1:
+ break
+ i += 1
+
+ result.append(row)
+ return result
+
+def toPyModel(model_ptr):
+ """
+ toPyModel(model_ptr) -> svm_model
+
+ Convert a ctypes POINTER(svm_model) to a Python svm_model
+ """
+ if bool(model_ptr) == False:
+ raise ValueError("Null pointer")
+ m = model_ptr.contents
+ m.__createfrom__ = 'C'
+ return m
+
+fillprototype(libsvm.svm_train, POINTER(svm_model), [POINTER(svm_problem), POINTER(svm_parameter)])
+fillprototype(libsvm.svm_cross_validation, None, [POINTER(svm_problem), POINTER(svm_parameter), c_int, POINTER(c_double)])
+
+fillprototype(libsvm.svm_save_model, c_int, [c_char_p, POINTER(svm_model)])
+fillprototype(libsvm.svm_load_model, POINTER(svm_model), [c_char_p])
+
+fillprototype(libsvm.svm_get_svm_type, c_int, [POINTER(svm_model)])
+fillprototype(libsvm.svm_get_nr_class, c_int, [POINTER(svm_model)])
+fillprototype(libsvm.svm_get_labels, None, [POINTER(svm_model), POINTER(c_int)])
+fillprototype(libsvm.svm_get_sv_indices, None, [POINTER(svm_model), POINTER(c_int)])
+fillprototype(libsvm.svm_get_nr_sv, c_int, [POINTER(svm_model)])
+fillprototype(libsvm.svm_get_svr_probability, c_double, [POINTER(svm_model)])
+
+fillprototype(libsvm.svm_predict_values, c_double, [POINTER(svm_model), POINTER(svm_node), POINTER(c_double)])
+fillprototype(libsvm.svm_predict, c_double, [POINTER(svm_model), POINTER(svm_node)])
+fillprototype(libsvm.svm_predict_probability, c_double, [POINTER(svm_model), POINTER(svm_node), POINTER(c_double)])
+
+fillprototype(libsvm.svm_free_model_content, None, [POINTER(svm_model)])
+fillprototype(libsvm.svm_free_and_destroy_model, None, [POINTER(POINTER(svm_model))])
+fillprototype(libsvm.svm_destroy_param, None, [POINTER(svm_parameter)])
+
+fillprototype(libsvm.svm_check_parameter, c_char_p, [POINTER(svm_problem), POINTER(svm_parameter)])
+fillprototype(libsvm.svm_check_probability_model, c_int, [POINTER(svm_model)])
+fillprototype(libsvm.svm_set_print_string_function, None, [PRINT_STRING_FUN])
diff --git a/src/backend/app/algorithms/evaluate/libsvm/python/svmutil.py b/src/backend/app/algorithms/evaluate/libsvm/python/svmutil.py
new file mode 100644
index 0000000..d353010
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/python/svmutil.py
@@ -0,0 +1,262 @@
+#!/usr/bin/env python
+
+import os
+import sys
+from svm import *
+from svm import __all__ as svm_all
+
+
+__all__ = ['evaluations', 'svm_load_model', 'svm_predict', 'svm_read_problem',
+ 'svm_save_model', 'svm_train'] + svm_all
+
+sys.path = [os.path.dirname(os.path.abspath(__file__))] + sys.path
+
+def svm_read_problem(data_file_name):
+ """
+ svm_read_problem(data_file_name) -> [y, x]
+
+ Read LIBSVM-format data from data_file_name and return labels y
+ and data instances x.
+ """
+ prob_y = []
+ prob_x = []
+ for line in open(data_file_name):
+ line = line.split(None, 1)
+ # In case an instance with all zero features
+ if len(line) == 1: line += ['']
+ label, features = line
+ xi = {}
+ for e in features.split():
+ ind, val = e.split(":")
+ xi[int(ind)] = float(val)
+ prob_y += [float(label)]
+ prob_x += [xi]
+ return (prob_y, prob_x)
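+# Example: the LIBSVM-format line "1 1:0.5 3:-0.2" is read as the label
+# y = 1.0 with the sparse instance x = {1: 0.5, 3: -0.2}.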
+
+def svm_load_model(model_file_name):
+ """
+ svm_load_model(model_file_name) -> model
+
+ Load a LIBSVM model from model_file_name and return.
+ """
+ model = libsvm.svm_load_model(model_file_name.encode())
+ if not model:
+ print("can't open model file %s" % model_file_name)
+ return None
+ model = toPyModel(model)
+ return model
+
+def svm_save_model(model_file_name, model):
+ """
+ svm_save_model(model_file_name, model) -> None
+
+ Save a LIBSVM model to the file model_file_name.
+ """
+ libsvm.svm_save_model(model_file_name.encode(), model)
+
+def evaluations(ty, pv):
+ """
+ evaluations(ty, pv) -> (ACC, MSE, SCC)
+
+ Calculate accuracy, mean squared error and squared correlation coefficient
+ using the true values (ty) and predicted values (pv).
+ """
+ if len(ty) != len(pv):
+		raise ValueError("len(ty) must equal len(pv)")
+ total_correct = total_error = 0
+ sumv = sumy = sumvv = sumyy = sumvy = 0
+ for v, y in zip(pv, ty):
+ if y == v:
+ total_correct += 1
+ total_error += (v-y)*(v-y)
+ sumv += v
+ sumy += y
+ sumvv += v*v
+ sumyy += y*y
+ sumvy += v*y
+ l = len(ty)
+ ACC = 100.0*total_correct/l
+ MSE = total_error/l
+ try:
+ SCC = ((l*sumvy-sumv*sumy)*(l*sumvy-sumv*sumy))/((l*sumvv-sumv*sumv)*(l*sumyy-sumy*sumy))
+ except:
+ SCC = float('nan')
+ return (ACC, MSE, SCC)
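+# Example: evaluations([1, -1, 1], [1, 1, 1]) returns ACC = 66.67 (2 of 3
+# labels correct), MSE = 4/3, and SCC = nan (the predictions have zero
+# variance, so the correlation is undefined).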
+
+def svm_train(arg1, arg2=None, arg3=None):
+ """
+ svm_train(y, x [, options]) -> model | ACC | MSE
+ svm_train(prob [, options]) -> model | ACC | MSE
+	svm_train(prob, param) -> model | ACC | MSE
+
+ Train an SVM model from data (y, x) or an svm_problem prob using
+ 'options' or an svm_parameter param.
+ If '-v' is specified in 'options' (i.e., cross validation)
+ either accuracy (ACC) or mean-squared error (MSE) is returned.
+ options:
+ -s svm_type : set type of SVM (default 0)
+ 0 -- C-SVC (multi-class classification)
+ 1 -- nu-SVC (multi-class classification)
+ 2 -- one-class SVM
+ 3 -- epsilon-SVR (regression)
+ 4 -- nu-SVR (regression)
+ -t kernel_type : set type of kernel function (default 2)
+ 0 -- linear: u'*v
+ 1 -- polynomial: (gamma*u'*v + coef0)^degree
+ 2 -- radial basis function: exp(-gamma*|u-v|^2)
+ 3 -- sigmoid: tanh(gamma*u'*v + coef0)
+ 4 -- precomputed kernel (kernel values in training_set_file)
+ -d degree : set degree in kernel function (default 3)
+ -g gamma : set gamma in kernel function (default 1/num_features)
+ -r coef0 : set coef0 in kernel function (default 0)
+ -c cost : set the parameter C of C-SVC, epsilon-SVR, and nu-SVR (default 1)
+ -n nu : set the parameter nu of nu-SVC, one-class SVM, and nu-SVR (default 0.5)
+ -p epsilon : set the epsilon in loss function of epsilon-SVR (default 0.1)
+ -m cachesize : set cache memory size in MB (default 100)
+ -e epsilon : set tolerance of termination criterion (default 0.001)
+ -h shrinking : whether to use the shrinking heuristics, 0 or 1 (default 1)
+ -b probability_estimates : whether to train a SVC or SVR model for probability estimates, 0 or 1 (default 0)
+ -wi weight : set the parameter C of class i to weight*C, for C-SVC (default 1)
+ -v n: n-fold cross validation mode
+ -q : quiet mode (no outputs)
+ """
+ prob, param = None, None
+ if isinstance(arg1, (list, tuple)):
+ assert isinstance(arg2, (list, tuple))
+ y, x, options = arg1, arg2, arg3
+ param = svm_parameter(options)
+ prob = svm_problem(y, x, isKernel=(param.kernel_type == PRECOMPUTED))
+ elif isinstance(arg1, svm_problem):
+ prob = arg1
+ if isinstance(arg2, svm_parameter):
+ param = arg2
+ else:
+ param = svm_parameter(arg2)
+ if prob == None or param == None:
+ raise TypeError("Wrong types for the arguments")
+
+ if param.kernel_type == PRECOMPUTED:
+ for xi in prob.x_space:
+ idx, val = xi[0].index, xi[0].value
+ if xi[0].index != 0:
+ raise ValueError('Wrong input format: first column must be 0:sample_serial_number')
+ if val <= 0 or val > prob.n:
+ raise ValueError('Wrong input format: sample_serial_number out of range')
+
+ if param.gamma == 0 and prob.n > 0:
+ param.gamma = 1.0 / prob.n
+ libsvm.svm_set_print_string_function(param.print_func)
+ err_msg = libsvm.svm_check_parameter(prob, param)
+ if err_msg:
+ raise ValueError('Error: %s' % err_msg)
+
+ if param.cross_validation:
+ l, nr_fold = prob.l, param.nr_fold
+ target = (c_double * l)()
+ libsvm.svm_cross_validation(prob, param, nr_fold, target)
+ ACC, MSE, SCC = evaluations(prob.y[:l], target[:l])
+ if param.svm_type in [EPSILON_SVR, NU_SVR]:
+ print("Cross Validation Mean squared error = %g" % MSE)
+ print("Cross Validation Squared correlation coefficient = %g" % SCC)
+ return MSE
+ else:
+ print("Cross Validation Accuracy = %g%%" % ACC)
+ return ACC
+ else:
+ m = libsvm.svm_train(prob, param)
+ m = toPyModel(m)
+
+ # If prob is destroyed, data including SVs pointed by m can remain.
+ m.x_space = prob.x_space
+ return m
+
+def svm_predict(y, x, m, options=""):
+ """
+ svm_predict(y, x, m [, options]) -> (p_labels, p_acc, p_vals)
+
+ Predict data (y, x) with the SVM model m.
+ options:
+ -b probability_estimates: whether to predict probability estimates,
+ 0 or 1 (default 0); for one-class SVM only 0 is supported.
+ -q : quiet mode (no outputs).
+
+ The return tuple contains
+ p_labels: a list of predicted labels
+ p_acc: a tuple including accuracy (for classification), mean-squared
+ error, and squared correlation coefficient (for regression).
+ p_vals: a list of decision values or probability estimates (if '-b 1'
+ is specified). If k is the number of classes, for decision values,
+ each element includes results of predicting k(k-1)/2 binary-class
+ SVMs. For probabilities, each element contains k values indicating
+ the probability that the testing instance is in each class.
+ Note that the order of classes here is the same as 'model.label'
+ field in the model structure.
+ """
+
+ def info(s):
+ print(s)
+
+ predict_probability = 0
+ argv = options.split()
+ i = 0
+ while i < len(argv):
+ if argv[i] == '-b':
+ i += 1
+ predict_probability = int(argv[i])
+ elif argv[i] == '-q':
+ info = print_null
+ else:
+ raise ValueError("Wrong options")
+ i+=1
+
+ svm_type = m.get_svm_type()
+ is_prob_model = m.is_probability_model()
+ nr_class = m.get_nr_class()
+ pred_labels = []
+ pred_values = []
+
+ if predict_probability:
+ if not is_prob_model:
+			raise ValueError("Model does not support probability estimates")
+
+ if svm_type in [NU_SVR, EPSILON_SVR]:
+ info("Prob. model for test data: target value = predicted value + z,\n"
+			"z: Laplace distribution e^(-|z|/sigma)/(2sigma),sigma=%g" % m.get_svr_probability())
+ nr_class = 0
+
+ prob_estimates = (c_double * nr_class)()
+ for xi in x:
+ xi, idx = gen_svm_nodearray(xi, isKernel=(m.param.kernel_type == PRECOMPUTED))
+ label = libsvm.svm_predict_probability(m, xi, prob_estimates)
+ values = prob_estimates[:nr_class]
+ pred_labels += [label]
+ pred_values += [values]
+ else:
+ if is_prob_model:
+			info("Model supports probability estimates, but disabled in prediction.")
+		if svm_type in (ONE_CLASS, EPSILON_SVR, NU_SVR):
+ nr_classifier = 1
+ else:
+ nr_classifier = nr_class*(nr_class-1)//2
+ dec_values = (c_double * nr_classifier)()
+ for xi in x:
+ xi, idx = gen_svm_nodearray(xi, isKernel=(m.param.kernel_type == PRECOMPUTED))
+ label = libsvm.svm_predict_values(m, xi, dec_values)
+ if(nr_class == 1):
+ values = [1]
+ else:
+ values = dec_values[:nr_classifier]
+ pred_labels += [label]
+ pred_values += [values]
+
+ ACC, MSE, SCC = evaluations(y, pred_labels)
+ l = len(y)
+ if svm_type in [EPSILON_SVR, NU_SVR]:
+ info("Mean squared error = %g (regression)" % MSE)
+ info("Squared correlation coefficient = %g (regression)" % SCC)
+ else:
+ info("Accuracy = %g%% (%d/%d) (classification)" % (ACC, int(l*ACC/100), l))
+
+ return pred_labels, (ACC, MSE, SCC), pred_values
+
+
diff --git a/src/backend/app/algorithms/evaluate/libsvm/svm-predict b/src/backend/app/algorithms/evaluate/libsvm/svm-predict
new file mode 100644
index 0000000..7c78120
Binary files /dev/null and b/src/backend/app/algorithms/evaluate/libsvm/svm-predict differ
diff --git a/src/backend/app/algorithms/evaluate/libsvm/svm-predict.c b/src/backend/app/algorithms/evaluate/libsvm/svm-predict.c
new file mode 100644
index 0000000..859c9ff
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/svm-predict.c
@@ -0,0 +1,239 @@
+#include <stdio.h>
+#include <stdlib.h>
+#include <string.h>
+#include <ctype.h>
+#include <errno.h>
+#include "svm.h"
+
+int print_null(const char *s,...) {return 0;}
+
+static int (*info)(const char *fmt,...) = &printf;
+
+struct svm_node *x;
+int max_nr_attr = 64;
+
+struct svm_model* model;
+int predict_probability=0;
+
+static char *line = NULL;
+static int max_line_len;
+
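+// readline() grows the line buffer geometrically: it keeps doubling
+// max_line_len and re-reading until the buffer holds the whole input
+// line (i.e., contains '\n'), returning NULL at end of file.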
+static char* readline(FILE *input)
+{
+ int len;
+
+ if(fgets(line,max_line_len,input) == NULL)
+ return NULL;
+
+ while(strrchr(line,'\n') == NULL)
+ {
+ max_line_len *= 2;
+ line = (char *) realloc(line,max_line_len);
+ len = (int) strlen(line);
+ if(fgets(line+len,max_line_len-len,input) == NULL)
+ break;
+ }
+ return line;
+}
+
+void exit_input_error(int line_num)
+{
+ fprintf(stderr,"Wrong input format at line %d\n", line_num);
+ exit(1);
+}
+
+void predict(FILE *input, FILE *output)
+{
+ int correct = 0;
+ int total = 0;
+ double error = 0;
+ double sump = 0, sumt = 0, sumpp = 0, sumtt = 0, sumpt = 0;
+
+ int svm_type=svm_get_svm_type(model);
+ int nr_class=svm_get_nr_class(model);
+ double *prob_estimates=NULL;
+ int j;
+
+ if(predict_probability)
+ {
+ if (svm_type==NU_SVR || svm_type==EPSILON_SVR)
+ info("Prob. model for test data: target value = predicted value + z,\nz: Laplace distribution e^(-|z|/sigma)/(2sigma),sigma=%g\n",svm_get_svr_probability(model));
+ else
+ {
+ int *labels=(int *) malloc(nr_class*sizeof(int));
+ svm_get_labels(model,labels);
+ prob_estimates = (double *) malloc(nr_class*sizeof(double));
+ fprintf(output,"labels");
+			for(j=0;j<nr_class;j++)
+				fprintf(output," %d",labels[j]);
+			fprintf(output,"\n");
+			free(labels);
+		}
+	}
+
+	max_line_len = 1024;
+	line = (char *)malloc(max_line_len*sizeof(char));
+	while(readline(input) != NULL)
+	{
+		int i = 0;
+		double target_label, predict_label;
+		char *idx, *val, *label, *endptr;
+		int inst_max_index = -1; // strtol gives 0 if wrong format, and precomputed kernel has <index> start from 0
+
+ label = strtok(line," \t\n");
+ if(label == NULL) // empty line
+ exit_input_error(total+1);
+
+ target_label = strtod(label,&endptr);
+ if(endptr == label || *endptr != '\0')
+ exit_input_error(total+1);
+
+ while(1)
+ {
+ if(i>=max_nr_attr-1) // need one more for index = -1
+ {
+ max_nr_attr *= 2;
+ x = (struct svm_node *) realloc(x,max_nr_attr*sizeof(struct svm_node));
+ }
+
+ idx = strtok(NULL,":");
+ val = strtok(NULL," \t");
+
+ if(val == NULL)
+ break;
+ errno = 0;
+ x[i].index = (int) strtol(idx,&endptr,10);
+ if(endptr == idx || errno != 0 || *endptr != '\0' || x[i].index <= inst_max_index)
+ exit_input_error(total+1);
+ else
+ inst_max_index = x[i].index;
+
+ errno = 0;
+ x[i].value = strtod(val,&endptr);
+ if(endptr == val || errno != 0 || (*endptr != '\0' && !isspace(*endptr)))
+ exit_input_error(total+1);
+
+ ++i;
+ }
+ x[i].index = -1;
+
+ if (predict_probability && (svm_type==C_SVC || svm_type==NU_SVC))
+ {
+ predict_label = svm_predict_probability(model,x,prob_estimates);
+ fprintf(output,"%g",predict_label);
+			for(j=0;j<nr_class;j++)
+				fprintf(output," %g",prob_estimates[j]);
+			fprintf(output,"\n");
+		}
+		else
+		{
+			predict_label = svm_predict(model,x);
+			fprintf(output,"%.17g\n",predict_label);
+		}
+
+		if(predict_label == target_label)
+			++correct;
+		error += (predict_label-target_label)*(predict_label-target_label);
+		sump += predict_label;
+		sumt += target_label;
+		sumpp += predict_label*predict_label;
+		sumtt += target_label*target_label;
+		sumpt += predict_label*target_label;
+		++total;
+	}
+	if (svm_type==NU_SVR || svm_type==EPSILON_SVR)
+	{
+		info("Mean squared error = %g (regression)\n",error/total);
+		info("Squared correlation coefficient = %g (regression)\n",
+			((total*sumpt-sump*sumt)*(total*sumpt-sump*sumt))/
+			((total*sumpp-sump*sump)*(total*sumtt-sumt*sumt))
+			);
+	}
+	else
+		info("Accuracy = %g%% (%d/%d) (classification)\n",
+			(double)correct/total*100,correct,total);
+	if(predict_probability)
+		free(prob_estimates);
+}
+
+void exit_with_help()
+{
+	printf(
+	"Usage: svm-predict [options] test_file model_file output_file\n"
+	"options:\n"
+	"-b probability_estimates: whether to predict probability estimates, 0 or 1 (default 0); for one-class SVM only 0 is supported\n"
+	"-q : quiet mode (no outputs)\n"
+	);
+	exit(1);
+}
+
+int main(int argc, char **argv)
+{
+	FILE *input, *output;
+	int i;
+	// parse options
+	for(i=1;i<argc;i++)
+	{
+		if(argv[i][0] != '-') break;
+		++i;
+		switch(argv[i-1][1])
+		{
+			case 'b':
+				predict_probability = atoi(argv[i]);
+				break;
+			case 'q':
+				info = &print_null;
+				i--;
+				break;
+			default:
+				fprintf(stderr,"Unknown option: -%c\n", argv[i-1][1]);
+				exit_with_help();
+		}
+	}
+
+	if(i>=argc-2)
+ exit_with_help();
+
+ input = fopen(argv[i],"r");
+ if(input == NULL)
+ {
+ fprintf(stderr,"can't open input file %s\n",argv[i]);
+ exit(1);
+ }
+
+ output = fopen(argv[i+2],"w");
+ if(output == NULL)
+ {
+ fprintf(stderr,"can't open output file %s\n",argv[i+2]);
+ exit(1);
+ }
+
+ if((model=svm_load_model(argv[i+1]))==0)
+ {
+ fprintf(stderr,"can't open model file %s\n",argv[i+1]);
+ exit(1);
+ }
+
+ x = (struct svm_node *) malloc(max_nr_attr*sizeof(struct svm_node));
+ if(predict_probability)
+ {
+ if(svm_check_probability_model(model)==0)
+ {
+ fprintf(stderr,"Model does not support probabiliy estimates\n");
+ exit(1);
+ }
+ }
+ else
+ {
+ if(svm_check_probability_model(model)!=0)
+ info("Model supports probability estimates, but disabled in prediction.\n");
+ }
+
+ predict(input,output);
+ svm_free_and_destroy_model(&model);
+ free(x);
+ free(line);
+ fclose(input);
+ fclose(output);
+ return 0;
+}
diff --git a/src/backend/app/algorithms/evaluate/libsvm/svm-scale b/src/backend/app/algorithms/evaluate/libsvm/svm-scale
new file mode 100644
index 0000000..659c5f5
Binary files /dev/null and b/src/backend/app/algorithms/evaluate/libsvm/svm-scale differ
diff --git a/src/backend/app/algorithms/evaluate/libsvm/svm-scale.c b/src/backend/app/algorithms/evaluate/libsvm/svm-scale.c
new file mode 100644
index 0000000..197537b
--- /dev/null
+++ b/src/backend/app/algorithms/evaluate/libsvm/svm-scale.c
@@ -0,0 +1,397 @@
+#include <stdio.h>
+#include <stdlib.h>