Translated

tech/20221228 AI- Anaconda and More on Probability.md
This commit is contained in:
toknow-gh 2023-12-18 19:59:03 +08:00
parent 61f09c4da7
commit d86bab8e45
2 changed files with 127 additions and 127 deletions


@@ -1,127 +0,0 @@
[#]: subject: "AI: Anaconda and More on Probability"
[#]: via: "https://www.opensourceforu.com/2022/12/ai-anaconda-and-more-on-probability/"
[#]: author: "Deepu Benson https://www.opensourceforu.com/author/deepu-benson/"
[#]: collector: "lujun9972/lctt-scripts-1700446145"
[#]: translator: "toknow-gh"
[#]: reviewer: " "
[#]: publisher: " "
[#]: url: " "
AI: Anaconda and More on Probability
======
[![AI Anaconda][1]][2]
_In the previous article in this series on AI and machine learning, we began by exploring the intricacies of using TensorFlow, a very powerful library used for developing AI and machine learning applications. Then, we discussed probability and paved the way for many of our future discussions. In this article, the fifth in the series, we will continue exploring concepts in probability and statistics._
To begin with, let us first install Anaconda, a distribution of Python for scientific computing especially useful for developing AI, machine learning and data science based applications. Later, we will also introduce a Python library called Theano in the context of probability theory. But before doing that, let us discuss the future of AI a little bit.
While going through the earlier articles in this series for evaluation and course correction, I wondered whether my tone was a bit sceptical at times while discussing the future of AI. By being brutally honest about some of the topics we discussed, I may have inadvertently demotivated a tiny fraction of the readers and potential practitioners of AI.
This made me research the financial aspects of AI and machine learning. I wanted to identify the kind of companies involved in the AI market. Are the bigwigs heavily involved or are there just some startups trying to move away from the pack? Another important question was: how much money will be pumped into the AI market by these companies in the near future? Is it going to be just a couple of million, or a few billion or maybe even a few trillion?
For the predictions and data stated here, I depended on articles published in respected newspapers in the recent past, rather than on scholarly articles whose treatment of the complex dynamics behind the development of an AI based economy I couldn't grasp well. An article published by Forbes in 2020 predicted that companies would spend 50 billion dollars on AI in financial year 2020. Well, that is a really huge investment, even if we acknowledge that the American billion (10^9) is just one-thousandth of the British billion (10^12). An article published in Fortune, another American business magazine, reported that venture capitalists are shifting some of their focus from artificial intelligence to newer and trendier fields like Web3 and decentralised finance (DeFi). But The Wall Street Journal (an American business-focused newspaper) confidently predicted in 2022 that "Big Tech Is Spending Billions on AI Research. Investors Should Keep an Eye Out."
What about the Indian scenario? Business Standard reported in 2022 that 87 per cent of the Indian companies will hike their AI expenditure by 10 per cent in the next three years. So, overall, it looks like the future of AI is very safe and bright. But which are the top companies investing in AI? As expected, all the giants like Amazon, Meta (Facebook), Alphabet (Google), Microsoft, IBM, etc, are doing so. But, surprisingly, companies like Shell, Johnson & Johnson, Unilever, Walmart, etc, whose primary focus is neither computing nor information technology, are also in the list of big companies heavily investing in AI. And yes! Amazon is a tech company and not an e-commerce company.
So it is clear that many of the giant companies in the world believe that AI is going to play a prominent role in the near future. But what changes and new trends are these leaders expecting? Again, to find some answers, I depended on news articles and interviews rather than on scholarly articles. The terms frequently mentioned in the context of future trends in AI include responsible AI, quantum AI, IoT with AI, AI and ethics, automated machine learning, etc. These are vast topics, and we will discuss some of them (except AI and ethics, which we already covered in the last article) in detail in the coming articles in this series.
### An introduction to Anaconda
Let us now move to discussing the necessary tech for AI. From this article onwards, we will start using Anaconda whenever necessary. Anaconda is a distribution of the Python and R programming languages for scientific computing. It helps a lot by simplifying package management.
Now, let us install Anaconda. First, go to the web page <https://www.anaconda.com/products/distribution#linux> and download the latest version of the Anaconda distribution installer. Open a terminal in the directory to which the installer was downloaded. As of this writing (October 2022), the latest Anaconda distribution installer for systems with 64-bit processors is Anaconda3-2022.05-Linux-x86_64.sh. To check the integrity of the installer, run the following command in the terminal:
```
shasum -a 256 Anaconda3-2022.05-Linux-x86_64.sh
```
If you have downloaded a different version of the installer, then use the name of that version after the command shasum -a 256. You will see a hash value displayed on the terminal, followed by the name of the installer. In my case, the line displayed was:
```
a7c0afe862f6ea19a596801fc138bde0463abcbce1b753e8d5c474b506a2db2d Anaconda3-2022.05-Linux-x86_64.sh
```
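If you prefer to do the check from Python instead of the shell, the same digest can be computed with the standard hashlib module. The following is a small sketch; the file name is assumed to be the installer downloaded above.
```
import hashlib

# Compute the SHA-256 digest of the downloaded installer in chunks,
# so that even a large file never has to fit in memory at once.
sha256 = hashlib.sha256()
with open("Anaconda3-2022.05-Linux-x86_64.sh", "rb") as f:
    for chunk in iter(lambda: f.read(1024 * 1024), b""):
        sha256.update(chunk)

# The printed value should match the hash published on the Anaconda website.
print(sha256.hexdigest())
```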
Now, go to the web page <https://docs.anaconda.com/anaconda/install/hashes> and locate the hash value for the version of the installer you have downloaded. Continue with the installation of Anaconda only if the hash value displayed on the terminal and the hash value given on the web page are exactly the same. If the hash values do not match, the downloaded installer is corrupted; in that case, download the installer again and repeat the check. If the hash values match, execute the following command on the terminal to begin the installation:
```
bash Anaconda3-2022.05-Linux-x86_64.sh
```
In case you have downloaded a different installer, use its name after the command bash. Now, press Enter, scroll down to go through the agreement and accept it. Finally, type yes to start the installation. Whenever a prompt appears, stick to the default option provided by Anaconda (unless you have a very good reason not to). With that, the Anaconda installation is complete.
By default, the Anaconda installation also includes Conda, a package manager and environment management system. Wikipedia says that the Anaconda distribution comes with over 250 packages automatically installed, with the option of installing over 7500 additional open source packages. The most important thing to remember is that any package or library you install using Anaconda can also be used in your Jupyter Notebook. Any updates other packages or libraries require during the installation of a new package will be handled automatically and carefully by Anaconda, without bothering you in the least.
So, finally, we have reached a place in our journey where we no longer need to worry about installing the new packages and libraries necessary to continue our quest for developing AI and machine learning based applications. Notice that Anaconda only has a command-line interface (CLI). Are we in trouble? No! Our installation also gives us Anaconda Navigator, a graphical user interface (GUI) for Anaconda. Execute the command anaconda-navigator on the terminal to run Anaconda Navigator (Figure 1). Soon, we will see an example of its power.
![Figure 1: Anaconda Navigator][3]
### Introduction to Theano
Now, let us try to work with Theano, a Python library and optimising compiler for evaluating mathematical expressions. The installation of Theano is very easy since we have Anaconda Navigator. First, open Anaconda Navigator. On the top right corner you will see the button Environments (marked with a red box in Figure 1). Press that button. You will go to a window showing the list of all currently installed packages. From the drop-down list on top, choose the option Not installed. Scroll down, find the option Theano, and click on the checkbox on the left. Now press the green button named Apply at the bottom right corner of the window. Anaconda will find out all the dependencies for installing Theano and show them in a pop-up menu. Figure 2 shows the pop-up menu I got during the installation of Theano in my system. You can see that in addition to Theano, one more new package is installed and eight other packages are modified.
Imagine how difficult this would have been if I were trying to install Theano manually. But with the help of Anaconda, all we need to do is press the Apply button at the bottom right corner of this pop-up menu. After a while, the installation of Theano completes smoothly. Now, we are ready to use Theano in our Jupyter Notebooks.
We are already familiar with SymPy, a Python library used for symbolic calculation, but Theano takes it to the next level. Let us look at a simple example to understand the power of Theano. Consider the code shown in Figure 3. Line 1 of the code imports Theano. Line 2 imports theano.tensor and names it T. We have already encountered tensors when we discussed TensorFlow.
![Figure 2: Installation of Theano][4]
![Figure 3: Our first code using Theano][5]
Mathematically, tensors can be treated as multi-dimensional arrays. The tensor is one of the key data structures used in Theano, and can be used to store and manipulate scalars (numbers), vectors (1-dimensional arrays), matrices (2-dimensional arrays), tensors (multi-dimensional arrays), and so on. In Line 3, a Theano function called function() is imported. Line 4 imports a Theano function called pp(), which is used for pretty-printing (printing in a format appealing to humans). Line 5 creates a symbolic scalar variable of type double called x. Symbolic variables of this sort are a bit tricky to understand: think of x as a zero-dimensional double variable with no value or storage associated with it yet. Similarly, Line 6 creates another symbolic scalar variable called y. Line 7 acts like a function in a sense. It tells the Python interpreter something like: 'if and when the symbolic scalar variables x and y get values, add those values and store the result in me (a)'.
To further illustrate symbolic operations at this point, let us try to understand Line 8.
![Figure 4: Manipulating matrices with Theano][6]
From Figure 3, it is clear that the output of this line of code is (x+y). So the actual addition of the two numbers is yet to take place. Lines 9 to 11 similarly define symbolic subtraction, multiplication, and division, respectively. If you want further clarity, use the function pp() to inspect the values of b, c, and d. Line 12 is very important: it defines a function named f() using Theano's function(). The function function() takes two parameters: the first is the list of symbolic inputs to be operated upon, and the second is the expression (or list of expressions) to be computed from those inputs. Here, the inputs are the two symbolic scalar variables x and y, and the outputs are [a, b, c, d], which represent symbolic addition, subtraction, multiplication, and division, respectively. Finally, in Line 13, the function f() is provided with actual values rather than symbolic ones. The output of this operation is also shown in Figure 3, and it is very easy to verify that the output shown is correct.
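Since the code itself appears only as a screenshot in Figure 3, here is a minimal sketch of what Lines 1 to 13 most likely look like, assuming the classic Theano API; the variable names and the sample inputs 2 and 3 are assumptions, not taken from the figure.
```
import theano                      # Line 1: import Theano
import theano.tensor as T          # Line 2: the tensor sub-module, named T
from theano import function        # Line 3: builds callable functions from symbolic graphs
from theano import pp              # Line 4: pretty-printer for symbolic expressions

x = T.dscalar('x')                 # Line 5: symbolic scalar of type double
y = T.dscalar('y')                 # Line 6: another symbolic scalar
a = x + y                          # Line 7: symbolic addition
print(pp(a))                       # Line 8: prints (x + y); no actual addition yet
b = x - y                          # Line 9: symbolic subtraction
c = x * y                          # Line 10: symbolic multiplication
d = x / y                          # Line 11: symbolic division
f = function([x, y], [a, b, c, d]) # Line 12: inputs [x, y], outputs [a, b, c, d]
print(f(2, 3))                     # Line 13: actual values; roughly [5.0, -1.0, 6.0, 0.666...]
```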
Now, let us see how matrices can be created and manipulated using Theano. Consider the code shown in Figure 4. It is important to note that the first three lines of code shown in Figure 3 (the imports) should be added here also, if you are writing this program independently of the programs we have discussed so far. Now, let us try to understand the code. Line 1 of the code creates two symbolic matrices x and y; think of them as 2-dimensional arrays with no values or storage associated yet. This time the matrices are of type integer, unlike the last time where the data type was double. Further, this time a plural constructor (imatrices) is used so that more than one matrix can be constructed at the same time. Lines 2 to 4 perform symbolic addition, subtraction, and multiplication, respectively, on the symbolic matrices x and y. Here again you can use print(pp(a)), print(pp(b)), and print(pp(c)) to better understand the symbolic nature of the operations being performed. If you add these lines of code, you will get (x + y), (x - y), and (x · y), respectively, as output. Line 5 generates a function f() which, as before, takes the symbolic inputs as its first parameter and the expressions to be computed as its second. This time, the inputs are the two symbolic matrices x and y, and the outputs are [a, b, c], which represent symbolic addition, subtraction, and multiplication, respectively. Finally, in Line 6, the function f() is provided with actual values rather than symbolic ones. The two matrices given as input to f() are [[1, 2], [3, 4]] and [[5, 6], [7, 8]]. The output of this operation is also shown in Figure 4, and it is very easy to verify that the three output matrices shown are correct. Notice that, in addition to scalars and matrices (for both of which we have now seen examples), tensor also offers constructors like vector, row, column, different types of tensors, etc. Let us stop discussing Theano for now and revisit it while discussing advanced topics in probability and statistics.
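Before moving on, here is a rough reconstruction of the Figure 4 code under the same assumptions as the sketch above. Note that in Theano the * operator on matrices is element-wise multiplication; a true matrix product would use T.dot, and the figure may show either.
```
import theano
import theano.tensor as T
from theano import function

x, y = T.imatrices('x', 'y')    # Line 1: two symbolic integer matrices at once
a = x + y                       # Line 2: symbolic addition
b = x - y                       # Line 3: symbolic subtraction
c = x * y                       # Line 4: symbolic (element-wise) multiplication
f = function([x, y], [a, b, c]) # Line 5: compile the three expressions
print(f([[1, 2], [3, 4]],       # Line 6: call with actual matrices
        [[5, 6], [7, 8]]))
```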
![Figure 5: Arithmetic mean and standard deviation][7]
### Baby steps with probability
Now, let us continue discussing probability and statistics. I had suggested in the last article (while introducing probability) that you carefully go through three Wikipedia articles, and with that assumption of familiarity I tried to motivate you by discussing the Normal distribution. However, we must revisit some of the basic notions of probability and statistics before we start developing AI and machine learning based applications. I hope all of you have heard about arithmetic mean and standard deviation (SD).
Arithmetic mean can be thought of as the average of a set of values. SD can be thought of as the variation or dispersion of a set of values. If the SD value is low, the elements in the set tend to be closer to the mean. Conversely, if the SD value is high, the elements of the set are spread out over a wider range. But how can we calculate the arithmetic mean and SD using Python? Python ships with a module called statistics which can be used to find the mean and standard deviation. However, experts consider this module slow, and hence we choose NumPy.
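For reference, for a list of values x_1, x_2, …, x_n, the two quantities are given by the following formulas (the second is the population standard deviation, which is also what NumPy computes by default):
```
\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i,
\qquad
\sigma = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2}
```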
Now, consider the code shown in Figure 5. It prints the mean and standard deviation of two lists C1 and C2 (whose values are hidden from you for the time being, for reasons that will soon be obvious). What interpretations can you make from these values? Nothing! They are just numbers for you right now. But what if I tell you the lists contain the marks of six students, studying in the 10th standard in school A and school B, for an exam in mathematics (the exam is out of 50 with a pass mark of 20)? The mean values tell us that students from both schools have relatively poor average marks, with school B slightly outperforming school A. But what does the standard deviation tell us? To further your understanding, I will give you the two lists, C1 = [20, 22, 20, 22, 22, 20] and C2 = [18, 16, 17, 16, 15, 48]. Though hidden by the mean, the large standard deviation value of 11.813363431112899 clearly captures the mass failure in school B. This example tells us that we need more sophisticated parameters to understand the complex nature of the problems we deal with. Probability and statistics come to our rescue here by providing more and more elaborate models which can imitate complex and chaotic data.
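The code in Figure 5 is not reproduced in the text, but a minimal NumPy equivalent would look something like the following sketch; the variable names C1 and C2 follow the article, and np.std's default (population) standard deviation matches the value quoted above.
```
import numpy as np

C1 = [20, 22, 20, 22, 22, 20]   # marks of the six students of school A
C2 = [18, 16, 17, 16, 15, 48]   # marks of the six students of school B

print(np.mean(C1), np.std(C1))  # 21.0 1.0
print(np.mean(C2), np.std(C2))  # 21.666... 11.813363431112899
```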
Random number generation is an essential part of probability. In practice, though, only pseudorandom number generation (a sequence of numbers whose properties approximate the properties of a sequence of random numbers) is possible. Now, let us see a few functions that will help us generate pseudorandom numbers. Consider the code shown in Figure 6. Line 1 imports the random package of Python. In Line 2, the function random.random() generates random numbers. This line of code, new_list = [random.random() for i in range(2)], uses a technique called list comprehension to generate two random numbers and store them in the list named new_list. Line 3 prints this list as output. Notice that the two random numbers printed change with each run of the code, and the probability of getting the same numbers printed twice in a row is theoretically zero. The single line of code in the second cell shown in Figure 6 uses the function random.choice(). This function makes an equally likely choice from the options given to it. In this case, the code fragment random.choice(["Heads", "Tails"]) will choose between Heads and Tails with equal probability. Notice that this line of code also uses list comprehension, so that three successive choices between Heads and Tails are made. Figure 6 shows the output of this code, where the option Tails is chosen three consecutive times.
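A minimal reconstruction of the two cells in Figure 6 is given below; the exact layout of the figure is an assumption.
```
import random

# Cell 1: two pseudorandom floats in [0.0, 1.0), built with a list comprehension
new_list = [random.random() for i in range(2)]
print(new_list)

# Cell 2: three equally likely picks between "Heads" and "Tails"
print([random.choice(["Heads", "Tails"]) for i in range(3)])
```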
![Figure 6: Pseudo random number generation][8]
Now, let us try to illustrate a very popular theorem in probability, the law of large numbers (LLN), with a simple example. The LLN states that the average of the results obtained from a large number of trials should be close to the expected value, and that this average tends to get closer and closer to the expected value as more and more trials are performed. We all know that if a fair die is thrown, the probability of getting the number 6 is 1/6. We now simulate this experiment with the simple Python code shown in Figure 7. Line 1 imports the random package of Python. Line 2 sets the number of trials to be performed; in this case, it is 1000. Line 3 initialises the counter ct to zero. Line 4 sets up a loop, which iterates 1000 times in this case. Line 5 uses the function random.randint(1, 6), which randomly generates an integer between 1 and 6 (inclusive of both), and the if statement on the same line checks whether the number generated is equal to 6; if yes, control goes to Line 7, which increases the counter ct by 1. After the loop has iterated 1000 times, Line 8 is executed. It prints the ratio between the number of occurrences of the number 6 and the total number of trials (1000). Figure 7 shows this number as 0.179, slightly more than the expected value 1/6 = 0.1666…, which is also printed in the output. Not that close to the expected value, right? Now set the value of n in Line 2 to 10000, run the code again, and observe the new output. Most probably you will get a number a little closer to the expected value (it could also be a number less than the expected value). Increase the value of n in Line 2 a few more times, and you will observe that the output inches closer and closer to the expected value. Thus, with a simple piece of code, we have illustrated the LLN.
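The code of Figure 7 is only visible as a screenshot; a sketch matching the description above would be as follows (the exact line breaks in the figure are an assumption, and the printed ratio will vary from run to run around 1/6).
```
import random

n = 1000                          # number of trials
ct = 0                            # counts how many rolls come up 6
for i in range(n):
    if random.randint(1, 6) == 6: # a fair die roll: an integer from 1 to 6
        ct = ct + 1

print(ct / n)                     # observed relative frequency of 6
print(1 / 6)                      # expected value: 0.1666...
```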
![Figure 7: Illustrating the law of large numbers][9]
Though the LLN is a simple statement, you will be amazed to know that the list of mathematicians who worked on a proof of it, or tried to refine one, includes Cardano, Jacob Bernoulli, Daniel Bernoulli, Poisson, Chebyshev, Markov, Borel, Cantelli, Kolmogorov, Khinchin and others. All of them are mathematical giants in their own fields.
We are yet to cover topics in probability like random variables, probability distributions, etc, which are essential in developing AI and machine learning based applications. Our discussion of topics in probability and statistics is still in the early stages and we will try to strengthen our knowledge of these subjects in the next article also. At the same time, we will revisit two of our old friends we met in this journey earlier, the libraries Pandas and TensorFlow. We will also befriend a relative of TensorFlow called Keras, a library which acts as an interface for TensorFlow.
--------------------------------------------------------------------------------
via: https://www.opensourceforu.com/2022/12/ai-anaconda-and-more-on-probability/
Author: [Deepu Benson][a]
Topic selection: [lujun9972][b]
Translator: [译者ID](https://github.com/译者ID)
Proofreader: [校对者ID](https://github.com/校对者ID)
This article was originally compiled by [LCTT](https://github.com/LCTT/TranslateProject) and is proudly presented by [Linux中国](https://linux.cn/)
[a]: https://www.opensourceforu.com/author/deepu-benson/
[b]: https://github.com/lujun9972
[1]: https://www.opensourceforu.com/wp-content/uploads/2022/11/AI-Anaconda-programming-696x477.jpg (AI-Anaconda-programming)
[2]: https://www.opensourceforu.com/wp-content/uploads/2022/11/AI-Anaconda-programming.jpg
[3]: https://www.opensourceforu.com/wp-content/uploads/2022/11/Figure-1-Anaconda-Navigator-.png
[4]: https://www.opensourceforu.com/wp-content/uploads/2022/11/Figure-2-Installation-of-Theano.png
[5]: https://www.opensourceforu.com/wp-content/uploads/2022/11/Figure-3-Our-first-code-using-Theano-590x394.png
[6]: https://www.opensourceforu.com/wp-content/uploads/2022/11/Figure-4-Manipulating-matrices-with-Theano.png
[7]: https://www.opensourceforu.com/wp-content/uploads/2022/11/Figure-5-Arithmetic-mean-and-standard-deviation-590x164.png
[8]: https://www.opensourceforu.com/wp-content/uploads/2022/11/Figure-6-Pseudo-random-number-generation-590x168.png
[9]: https://www.opensourceforu.com/wp-content/uploads/2022/11/Figure-7-Illustrating-the-law-of-large-numbers-590x266.png


@@ -0,0 +1,127 @@
[#]: subject: "AI: Anaconda and More on Probability"
[#]: via: "https://www.opensourceforu.com/2022/12/ai-anaconda-and-more-on-probability/"
[#]: author: "Deepu Benson https://www.opensourceforu.com/author/deepu-benson/"
[#]: collector: "lujun9972/lctt-scripts-1700446145"
[#]: translator: "toknow-gh"
[#]: reviewer: " "
[#]: publisher: " "
[#]: url: " "
AI series: Anaconda and More on Probability
======
[![AI Anaconda][1]][2]
_In the [previous article](https://linux.cn/article-16485-1.html) in this series, we began by introducing TensorFlow, a very powerful library for developing AI and machine learning applications, and then discussed probability to lay the groundwork for many of our future discussions. In this article, the fifth in the series, we continue exploring concepts in probability and statistics._
In this article I will first introduce Anaconda, a distribution of Python for scientific computing that is especially useful for developing AI, machine learning and data science applications. Later, we will introduce a Python library called Theano. But before that, let us briefly discuss the future of AI.
While reviewing and revising the earlier articles in this series, I realised that my occasionally sceptical tone about the prospects of AI, and my brutally honest treatment of some topics, may have inadvertently discouraged some readers.
This prompted me to study AI and machine learning from a financial perspective. I wanted to identify the kinds of companies involved in the AI market. Are the heavyweights deeply involved, or are there only a few startups pushing things forward? How much money will these companies pour into the AI market in the near future: millions of dollars, billions, or even trillions?
I relied on predictions and data published recently in well-known newspapers to understand the complex dynamics behind an AI-based economy. A Forbes article from 2020 predicted that corporate spending on AI would reach 50 billion US dollars in 2020. That is a huge investment. An article in Fortune reported that venture capitalists are shifting some of their attention from AI towards newer and trendier fields such as Web3 and decentralised finance (DeFi). But in 2022 The Wall Street Journal confidently predicted, "Big Tech Is Spending Billions on AI Research. Investors Should Keep an Eye Out."
India's Business Standard reported in 2022 that 87 per cent of Indian companies will raise their AI spending by 10 per cent over the next three years. Overall, the future of AI looks very safe and bright. Surprisingly, besides top technology giants such as Amazon, Meta (Facebook's parent company), Alphabet (Google's parent company), Microsoft and IBM, companies whose primary business is neither computing nor IT, such as Shell, Johnson & Johnson, Unilever and Walmart, are also investing heavily in AI.
Clearly, many of the world's largest companies believe that AI will play a prominent role in the near future. But what changes and new trends do they expect? I found some answers in news articles and interviews. The terms frequently mentioned in the context of future AI trends include responsible AI, quantum AI, AI with IoT, AI and ethics, and automated machine learning. These are all topics worth exploring in depth; we already discussed AI and ethics in the previous article, and we will discuss some of the others in detail in the coming articles in this series.
### Getting started with Anaconda
Now let us turn to the technology we need for AI. Anaconda is a distribution of the Python and R programming languages for scientific computing, and it greatly simplifies package management. From this article onwards, we will use Anaconda whenever necessary. The first step is to install it. Visit the [installer download page](https://www.anaconda.com/products/distribution#linux) and download the latest Anaconda distribution installer. As of this writing (October 2022), the latest installer for 64-bit systems is Anaconda3-2022.05-Linux-x86_64.sh. If you have downloaded a different version, simply substitute the actual file name in the commands that follow. After downloading, verify the integrity of the installer: open a terminal in the directory containing the installer and run the following command:
```
shasum -a 256 Anaconda3-2022.05-Linux-x86_64.sh
```
The terminal prints a hash value followed by the file name. On my system the output was:
```
a7c0afe862f6ea19a596801fc138bde0463abcbce1b753e8d5c474b506a2db2d Anaconda3-2022.05-Linux-x86_64.sh
```
Then visit the [Anaconda installer hashes page](https://docs.anaconda.com/anaconda/install/hashes) and compare the hash of the installer you downloaded. If the hash values match, the downloaded file is intact; otherwise, download it again. Then run the following command in the terminal to start the installation:
```
bash Anaconda3-2022.05-Linux-x86_64.sh
```
Press Enter, scroll down through the user agreement and accept it. Finally, type "yes" to start the installation. When interactive prompts appear, it is usually fine to go with Anaconda's default options. Anaconda is now installed.
By default, Anaconda also installs Conda, a package manager and environment management system. The Anaconda distribution comes with over 250 packages installed automatically, with the option of installing more than 7500 additional open source packages. Moreover, any package or library installed through Anaconda can also be used in Jupyter Notebook. When a new package is installed, Anaconda automatically handles any updates required by its dependencies.
From now on we no longer need to worry about installing packages and libraries, and we can get on with developing our AI and machine learning applications. Note that Anaconda itself only has a command-line interface. Fortunately, the installation also includes Anaconda Navigator, a graphical user interface for Anaconda. Run the command `anaconda-navigator` in a terminal to start Anaconda Navigator (Figure 1). We will soon see an example of its power.
![Figure 1: Anaconda Navigator][3]
### An introduction to Theano
Theano is a Python library and optimising compiler for evaluating mathematical expressions. Installing Theano is very easy with Anaconda Navigator. Open Anaconda Navigator and click the Environments button (marked with a red box in Figure 1). The window that opens lists all of the currently installed packages. From the drop-down list at the top, choose the "Not installed" option. Scroll down, find Theano, and tick the checkbox on its left. Then click the green Apply button at the bottom right corner of the window. Anaconda works out all the dependencies needed to install Theano and shows them in a pop-up menu. Figure 2 shows the pop-up I got while installing Theano on my system. You can see that, in addition to Theano, one more new package is installed and eight packages are modified.
Imagine how troublesome this would be if we had to install Theano manually. With Anaconda, a few clicks are all it takes. After a short wait, Theano is installed, and we can use it in our Jupyter Notebooks.
![Figure 2: Installing Theano][4]
We are already familiar with SymPy, a Python library for symbolic computation, but Theano takes symbolic computation to the next level. Figure 3 shows an example that uses Theano. Line 1 of the code imports Theano. Line 2 imports `theano.tensor` and names it `T`. We already met tensors when we discussed TensorFlow.
![Figure 3: A first code example using Theano][5]
Mathematically, a tensor can be regarded as a multi-dimensional array. Tensors are one of Theano's key data structures and can be used to store and manipulate scalars (numbers), vectors (1-dimensional arrays), matrices (2-dimensional arrays), tensors (multi-dimensional arrays), and so on. Line 3 imports the function `function()` from Theano. Line 4 imports the Theano function `pp()`, which is used for pretty-printing. Line 5 creates a symbolic scalar variable of type `double` named `x`. The notion of a symbolic variable can be a little hard to grasp: think of `x` as a `double` object with no concrete value bound to it yet. Similarly, Line 6 creates another symbolic scalar variable named `y`. Line 7 tells the Python interpreter that, when the symbolic variables `x` and `y` receive values, their sum should be stored in `a`.
To further illustrate symbolic operations, look carefully at Line 8: its output is `(x+y)`, which shows that the actual addition of two numbers has not yet taken place. Lines 9 to 11 similarly define symbolic subtraction, multiplication and division. You can use the function `pp()` yourself to inspect the values of `b`, `c` and `d`. Line 12 is crucial: it uses Theano's `function()` to define a new function named `f()`, whose inputs are `x` and `y` and whose outputs are `[a, b, c, d]`. Finally, in Line 13, `f()` is called with actual values rather than symbolic ones. The output of this operation is also shown in Figure 3, and it is easy to verify that it is correct.
![Figure 4: Manipulating matrices with Theano][6]
Now let us see, through the code in Figure 4, how matrices can be created and manipulated with Theano. Note that the import statements are omitted in the figure; if you run the code in Figure 4 on its own, you need to add the first three lines of imports from Figure 3. Line 1 creates two symbolic matrices `x` and `y`. Here the plural constructor `imatrices` is used, which can construct several matrices at once. Lines 2 to 4 perform symbolic addition, subtraction and multiplication on the symbolic matrices `x` and `y`. Again, you can use `print(pp(a))`, `print(pp(b))` and `print(pp(c))` to better understand the symbolic nature of these operations. Line 5 creates a function `f()` whose inputs are the two symbolic matrices `x` and `y` and whose outputs are `[a, b, c]`, representing symbolic addition, subtraction and multiplication, respectively. Finally, in Line 6, `f()` is called with actual values. The output of this operation is also shown in Figure 4, and it is easy to verify that the three output matrices are correct. Note that, in addition to scalars and matrices, tensor also offers constructors for vectors, rows, columns and various kinds of tensors. We will leave Theano here for now and return to it when we discuss more advanced topics in probability and statistics.
### A little more probability
![Figure 5: Arithmetic mean and standard deviation][7]
Now let us continue with probability and statistics. In the previous article I suggested that you read three Wikipedia articles carefully, and then introduced the Normal distribution. Before we start developing AI and machine learning applications, we need to revisit some basic notions of probability and statistics. We begin with the arithmetic mean and the standard deviation (SD).
The arithmetic mean can be thought of as the average of a set of values. The standard deviation describes how dispersed a set of values is. If the SD is small, the elements of the set tend to be close to the mean; conversely, if the SD is large, the elements are spread over a wider range. How do we compute the arithmetic mean and standard deviation in Python? Python has a module called statistics that can compute both, but experienced users consider it slow, so we choose NumPy instead.
The code shown in Figure 5 prints the mean and standard deviation of two lists `C1` and `C2` (whose contents I have hidden for the moment). What can you read from these values? Nothing; right now they are just numbers to you. But suppose I tell you that the lists contain the mathematics exam marks of six students from school A and school B respectively (the exam is out of 50, with a pass mark of 20). The means tell us that the students of both schools have fairly poor average marks, with school B slightly outperforming school A. What does the standard deviation tell us? To deepen your understanding, here are the two lists: `C1 = [20, 22, 20, 22, 22, 20]` and `C2 = [18, 16, 17, 16, 15, 48]`. Although it is hidden by the mean, the very large standard deviation of school B clearly reflects the mass failure there. This example shows that we need more sophisticated parameters to handle the complexity of the problems we deal with. Probability and statistics come to our rescue by providing ever more elaborate models that can describe complex and messy data.
Random number generation is an essential part of probability. In practice, however, we can only generate pseudorandom numbers: sequences whose properties approximate those of truly random sequences. Figure 6 introduces a few functions for generating pseudorandom numbers. Line 1 imports Python's `random` package. Line 2 generates two random numbers and stores them in a list named `new_list`: the function `random.random()` generates the random numbers, and the code `new_list = [random.random() for i in range(2)]` uses Python's list comprehension syntax. Line 3 prints the list. Note that the two numbers change every time the code is run, and the probability of printing the same numbers twice in a row is theoretically zero. The second code cell in Figure 6 uses `random.choice()`, which picks one of the given options with equal probability. The fragment `random.choice(["Heads", "Tails"])` chooses between "Heads" and "Tails" with equal probability. This line also uses a list comprehension, so that three successive choices are made. In the output shown in Figure 6, "Tails" is chosen three times in a row.
![Figure 6: Pseudorandom number generation][8]
Now let us illustrate the famous law of large numbers (LLN) with a simple example. The LLN states that the average of the results obtained from a large number of trials should be close to the expected value, and that it gets closer and closer to the expected value as the number of trials increases. We all know that the probability of getting a 6 when a fair die is thrown is 1/6. The Python code in Figure 7 simulates this experiment. Line 1 imports Python's `random` package. Line 2 sets the number of trials to 1000. Line 3 initialises the counter `ct` to zero. Line 4 starts a loop that iterates 1000 times. Line 5 uses `random.randint(1, 6)` to generate a random integer between 1 and 6 (inclusive of both) and checks whether it equals 6; if so, control passes to Line 7, which increases the counter `ct` by 1. After the loop has run 1000 times, Line 8 prints the ratio between the number of occurrences of 6 and the total number of trials. Figure 7 shows this ratio as 0.179, somewhat higher than the expected value 1/6 = 0.1666…. Now set `n` to 10000 in Line 2, run the code again and observe the output. Most likely you will get a number closer to the expected value (it may also be a little below it). Keep increasing `n` in Line 2 and you will see the output inch closer and closer to the expected value.
![Figure 7: Illustrating the law of large numbers][9]
Although the statement of the LLN is plain and simple, you may be amazed to learn which mathematicians proved it or refined its proof: Cardano, Jacob Bernoulli, Daniel Bernoulli, Poisson, Chebyshev, Markov, Borel, Cantelli, Kolmogorov, Khinchin and others. All of them are giants in their own fields.
We have yet to cover topics in probability such as random variables and probability distributions, which are essential for developing AI and machine learning applications. Our discussion of probability and statistics is still at an early stage, and we will continue to strengthen this knowledge in the next article. At the same time, we will meet two old friends again, Pandas and TensorFlow, and get to know Keras, a library that acts as an interface for TensorFlow.
--------------------------------------------------------------------------------
via: https://www.opensourceforu.com/2022/12/ai-anaconda-and-more-on-probability/
Author: [Deepu Benson][a]
Topic selection: [lujun9972][b]
Translator: [toknow-gh](https://github.com/toknow-gh)
Proofreader: [校对者ID](https://github.com/校对者ID)
This article was originally compiled by [LCTT](https://github.com/LCTT/TranslateProject) and is proudly presented by [Linux中国](https://linux.cn/)
[a]: https://www.opensourceforu.com/author/deepu-benson/
[b]: https://github.com/lujun9972
[1]: https://www.opensourceforu.com/wp-content/uploads/2022/11/AI-Anaconda-programming-696x477.jpg (AI-Anaconda-programming)
[2]: https://www.opensourceforu.com/wp-content/uploads/2022/11/AI-Anaconda-programming.jpg
[3]: https://www.opensourceforu.com/wp-content/uploads/2022/11/Figure-1-Anaconda-Navigator-.png
[4]: https://www.opensourceforu.com/wp-content/uploads/2022/11/Figure-2-Installation-of-Theano.png
[5]: https://www.opensourceforu.com/wp-content/uploads/2022/11/Figure-3-Our-first-code-using-Theano-590x394.png
[6]: https://www.opensourceforu.com/wp-content/uploads/2022/11/Figure-4-Manipulating-matrices-with-Theano.png
[7]: https://www.opensourceforu.com/wp-content/uploads/2022/11/Figure-5-Arithmetic-mean-and-standard-deviation-590x164.png
[8]: https://www.opensourceforu.com/wp-content/uploads/2022/11/Figure-6-Pseudo-random-number-generation-590x168.png
[9]: https://www.opensourceforu.com/wp-content/uploads/2022/11/Figure-7-Illustrating-the-law-of-large-numbers-590x266.png