COMP9444 Neural Networks and Deep Learning
Term 2, 2024
Assignment - Characters and Hidden Unit Dynamics
Due: Tuesday 2 July, 23:59
Marks: 20% of final assessment
In this assignment, you will be implementing and training neural network models for three different tasks, and analysing the results. You are to submit two Python files kuzu.py
and check.py, as well as a written report hw1.pdf (in pdf format).
Provided Files
Copy the archive hw1.zip into your own filespace and unzip it. This should create a directory hw1, subdirectories net and plot, and eight Python files kuzu.py, check.py,
kuzu_main.py, check_main.py, seq_train.py, seq_models.py, seq_plot.py and anb2n.py.
Your task is to complete the skeleton files kuzu.py and check.py and submit them, along with your report.
Part 1: Japanese Character Recognition
For Part 1 of the assignment you will be implementing networks to recognize handwritten Hiragana symbols. The dataset to be used is Kuzushiji-MNIST or KMNIST for short.
The paper describing the dataset is available here. It is worth reading, but in short: significant changes occurred to the language when Japan reformed their education system in
1868, and the majority of Japanese today cannot read texts published over 150 years ago. This paper presents a dataset of handwritten, labeled examples of this old-style script
(Kuzushiji). Along with this dataset, however, they also provide a much simpler one, containing 10 Hiragana characters with 7000 samples per class. This is the dataset we will be
using.
Text from 1772 (left) compared with a later, standardized sample (right), showing the standardization of written Japanese.
1. [1 mark] Implement a model NetLin which computes a linear function of the pixels in the image, followed by log softmax. Run the code by typing:
python3 kuzu_main.py --net lin
Copy the final accuracy and confusion matrix into your report. The final accuracy should be around 70%. Note that the rows of the confusion matrix indicate the target
character, while the columns indicate the one chosen by the network. (0="o", 1="ki", 2="su", 3="tsu", 4="na", 5="ha", 6="ma", 7="ya", 8="re", 9="wo"). More examples
of each character can be found here.
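As a rough guide only (not the required solution; the skeleton in kuzu.py fixes the actual class interface), a NetLin of this kind can be a single linear layer over the flattened image, assuming 28x28 KMNIST inputs and 10 output classes:
import torch.nn as nn
import torch.nn.functional as F

class NetLin(nn.Module):
    # Hedged sketch: one linear map from the 784 pixels to 10 classes,
    # followed by log softmax. The provided skeleton may differ in detail.
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(28 * 28, 10)    # 784 inputs, 10 classes (assumed sizes)

    def forward(self, x):
        x = x.view(x.shape[0], -1)          # flatten each image to a 784-vector
        return F.log_softmax(self.fc(x), dim=1)
With these assumed sizes the model has 28*28*10 weights plus 10 biases, i.e. 7850 independent parameters.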
2. [1 mark] Implement a fully connected 2-layer network NetFull (i.e. one hidden layer, plus the output layer), using tanh at the hidden nodes and log softmax at the output
node. Run the code by typing:
python3 kuzu_main.py --net full
Try different values (multiples of 10) for the number of hidden nodes and determine a value that achieves high accuracy (at least 84%) on the test set. Copy the final
accuracy and confusion matrix into your report, and include a calculation of the total number of independent parameters in the network.
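A hedged sketch of one possible NetFull, assuming 28x28 inputs, 10 classes, and a hidden-layer size hid chosen by you (the constructor signature in the provided skeleton may differ):
import torch
import torch.nn as nn
import torch.nn.functional as F

class NetFull(nn.Module):
    # Hedged sketch: one tanh hidden layer plus a log-softmax output layer.
    def __init__(self, hid=100):             # hid = 100 is only an example value
        super().__init__()
        self.in_to_hid = nn.Linear(28 * 28, hid)
        self.hid_to_out = nn.Linear(hid, 10)

    def forward(self, x):
        x = x.view(x.shape[0], -1)            # flatten to a 784-vector
        h = torch.tanh(self.in_to_hid(x))
        return F.log_softmax(self.hid_to_out(h), dim=1)
For a hidden layer of H nodes, this architecture has (784 + 1)*H + (H + 1)*10 independent parameters (weights plus biases); with H = 100, for example, that is 79510.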
3. [2 marks] Implement a convolutional network called NetConv, with two convolutional layers plus one fully connected layer, all using the relu activation function, followed by
the output layer, using log softmax. You are free to choose for yourself the number and size of the filters, metaparameter values (learning rate and momentum), and whether
to use max pooling or a fully convolutional architecture. Run the code by typing:
python3 kuzu_main.py --net conv
Your network should consistently achieve at least 93% accuracy on the test set after 10 training epochs. Copy the final accuracy and confusion matrix into your report, and
include a calculation of the total number of independent parameters in the network.
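As an illustration only (the filter counts, kernel sizes and hidden size below are arbitrary choices, not requirements), a NetConv of the described shape might look like:
import torch
import torch.nn as nn
import torch.nn.functional as F

class NetConv(nn.Module):
    # Hedged sketch: two relu convolutional layers with max pooling,
    # one relu fully connected layer, then a log-softmax output layer.
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, kernel_size=5, padding=2)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=5, padding=2)
        self.fc1 = nn.Linear(32 * 7 * 7, 128)   # 7x7 feature maps after two 2x2 poolings of 28x28
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)   # 28x28 -> 14x14
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)   # 14x14 -> 7x7
        x = x.view(x.shape[0], -1)
        x = F.relu(self.fc1(x))
        return F.log_softmax(self.fc2(x), dim=1)
For this particular illustrative choice, the parameter count would be (5*5*1+1)*16 + (5*5*16+1)*32 + (1568+1)*128 + (128+1)*10 = 416 + 12832 + 200832 + 1290 = 215370; your own calculation should of course use the filter sizes you actually choose.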
4. [4 marks] Briefly discuss the following points:
a. the relative accuracy of the three models,
b. the number of independent parameters in each of the three models,
c. the confusion matrix for each model: which characters are most likely to be mistaken for which other characters, and why?
Part 2: Multi-Layer Perceptron
In Part 2 you will be exploring 2-layer neural networks (either trained, or designed by hand) to classify a small set of 9 two-dimensional training items.
1. [1 mark] Train a 2-layer neural network with either 5 or 6 hidden nodes, using sigmoid activation at both the hidden and output layer, on the above data, by typing:
python3 check_main.py --act sig --hid 6
You may need to run the code a few times before it achieves an accuracy of 100%. If the network appears to be stuck in a local minimum, you can terminate the process with
ctrl-C and start again. You are free to adjust the learning rate and the number of hidden nodes, if you wish (see code for details). The code should produce images in the
plot subdirectory graphing the function computed by each hidden node (hid_6_?.jpg) and the network as a whole (out_6.jpg). Copy these images into your report.
2. [2 marks] Design by hand a 2-layer neural network with 4 hidden nodes, using the Heaviside (step) activation function at both the hidden and output layer, which correctly
classifies the above data. Include a diagram of the network in your report, clearly showing the value of all the weights and biases. Write the equations for the dividing line
determined by each hidden node. Create a table showing the activations of all the hidden nodes and the output node, for each of the 9 training items, and include it in your
report. You can check that your weights are correct by entering them in the part of check.py where it says "Enter Weights Here", and typing:
python3 check_main.py --act step --hid 4 --set_weights
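As a reminder of the general form (the particular weight values are for you to choose): writing the two input coordinates as x and y, a hidden node with input weights w1, w2 and bias b, under the Heaviside activation, outputs 1 exactly when w1*x + w2*y + b >= 0, so the dividing line it determines is
w1*x + w2*y + b = 0
and the output node then applies a step function to a weighted sum of the four hidden-node activations plus its own bias.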
3. [1 mark] Now rescale your hand-crafted weights and biases from the previous question by multiplying all of them by a large (fixed) number (for example, 10) so that the combination of
rescaling followed by sigmoid will mimic the effect of the step function. With these re-scaled weights and biases, the data should be correctly classified by the sigmoid
network as well as the step function network. Verify that this is true by typing:
python3 check_main.py --act sig --hid 4 --set_weights
Once again, the code should produce images in the plot subdirectory showing the function computed by each hidden node (hid_4_?.jpg) and the network as a whole
(out_4.jpg). Copy these images into your report, and be ready to submit check.py with the (rescaled) weights as part of your assignment submission.
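The reason this works: sigmoid(z) = 1/(1 + e^(-z)), and multiplying every weight and bias by a factor k scales each node's pre-activation from z to k*z. With k = 10, for example, a point whose pre-activation was 0.5 under the step network now gives sigmoid(5) ≈ 0.993, and one with -0.5 gives sigmoid(-5) ≈ 0.007, so the sigmoid network behaves almost exactly like the step network except very close to the dividing lines.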
Part 3: Hidden Unit Dynamics for Recurrent Networks
In Part 3 you will be investigating the hidden unit dynamics of recurrent networks trained on language prediction tasks, using the supplied code seq_train.py and seq_plot.py.
1. [2 marks] Train a Simple Recurrent Network (SRN) on the Reber Grammar prediction task by typing
python3 seq_train.py --lang reber
This SRN has 7 inputs, 2 hidden units and 7 outputs. The trained networks are stored every 10000 epochs, in the net subdirectory. After the training finishes, plot the
hidden unit activations at epoch 50000 by typing
python3 seq_plot.py --lang reber --epoch 50
The dots should be arranged in discernible clusters by color. If they are not, run the code again until the training is successful. The hidden unit activations are printed
according to their "state", using the colormap "jet".
Based on this colormap, annotate your figure (either electronically, or with a pen on a printout) by drawing a circle around the cluster of points corresponding to each state
in the state machine, and drawing arrows between the states, with each arrow labeled with its corresponding symbol. Include the annotated figure in your report.
2. [1 mark] Train an SRN on the a^n b^n language prediction task by typing
python3 seq_train.py --lang anbn
The a^n b^n language is a concatenation of blocks, each consisting of a random number of A's followed by an equal number of B's. The SRN has 2 inputs, 2 hidden units and 2 outputs.
Look at the predicted probabilities of A and B as the training progresses. The first B in each sequence and all A's after the first A are not deterministic and can only be
predicted in a probabilistic sense. However, if training is successful, all other symbols should be correctly predicted. In particular, the network should predict the last B in
each sequence as well as the subsequent A. The error should be consistently in the range of 0.01 to 0.03. If the network appears to have learned the task successfully, you
can stop it at any time using ctrl-C. If it appears to be stuck in a local minimum, you can stop it and run the code again until it is successful.
After the training finishes, plot the hidden unit activations by typing
python3 seq_plot.py --lang anbn --epoch 100
Include the resulting figure in your report. The states are again printed according to the colormap "jet". Note, however, that these "states" are not unique but are instead used
to count either the number of A's we have seen or the number of B's we are still expecting to see.
Briefly explain how the a^n b^n prediction task is achieved by the network, based on the generated figure. Specifically, you should describe how the hidden unit activations
change as the string is processed, and how it is able to correctly predict the last B in each sequence as well as the following A.
3. [2 marks] Train an SRN on the a^n b^n c^n language prediction task by typing
python3 seq_train.py --lang anbncn
The SRN now has 3 inputs, 3 hidden units and 3 outputs. Again, the "state" is used to count up the A's and count down the B's and C's. Continue training (and re-start, if
necessary) for 200k epochs, or until the network is able to reliably predict all the C's as well as the subsequent A, and the error is consistently in the range of 0.01 to 0.03.
After the training finishes, plot the hidden unit activations at epoch 200000 by typing
python3 seq_plot.py --lang anbncn --epoch 200
(you can choose a different epoch number, if you wish). This should produce three images labeled anbncn_srn3_??.jpg, and also display an interactive 3D figure. Try to
rotate the figure in 3 dimensions to get one or more good view(s) of the points in hidden unit space, save them, and include them in your report. (If you can't get the 3D
figure to work on your machine, you can use the images anbncn_srn3_??.jpg)
Briefly explain how the a^n b^n c^n prediction task is achieved by the network, based on the generated figure. Specifically, you should describe how the hidden unit activations
change as the string is processed, and how it is able to correctly predict the last B in each sequence as well as all of the C's and the following A.
4. [3 marks] This question is intended to be more challenging. Train an LSTM network to predict the Embedded Reber Grammar, by typing
python3 seq_train.py --lang reber --embed True --model lstm --hid 4
You can adjust the number of hidden nodes if you wish. Once the training is successful, try to analyse the behavior of the LSTM and explain how the task is accomplished
(this might involve modifying the code so that it returns and prints out the context units as well as the hidden units).
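The supplied seq_models.py is the authoritative interface; purely as a generic illustration of how hidden and context (cell) states can be traced in PyTorch, one option is to step an LSTM cell through a sequence manually and print both states at each time step (all names and sizes below are illustrative):
import torch
import torch.nn as nn

hid = 4                                              # number of hidden units (example value)
cell = nn.LSTMCell(input_size=7, hidden_size=hid)    # 7 symbols, as in the Reber task above

def trace_states(seq):
    # seq: tensor of shape (T, 7), one one-hot symbol per time step
    h = torch.zeros(1, hid)                          # hidden state
    c = torch.zeros(1, hid)                          # context (cell) state
    for t, x_t in enumerate(seq):
        h, c = cell(x_t.unsqueeze(0), (h, c))
        print(f"step {t}: hidden {h.detach().numpy()}, context {c.detach().numpy()}")
    return h, c
Whether you trace the states this way or by modifying the supplied code, the aim is the same: to see how the hidden and context units encode where the network is within the embedded grammar.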
Submission
You should submit by typing
give cs9444 hw1 kuzu.py check.py hw1.pdf
You can submit as many times as you like; later submissions will overwrite earlier ones. You can check that your submission has been received by using the following
command:
9444 classrun -check hw1
The submission deadline is Tuesday 2 July, 23:59. In accordance with UNSW-wide policies, a 5% penalty will be applied for every 24 hours late after the deadline, up to a
maximum of 5 days, after which submissions will not be accepted.
Additional information may be found in the FAQ and will be considered as part of the specification for the project. You should check this page regularly.
Plagiarism Policy
Group submissions will not be allowed for this assignment. Your code and report must be entirely your own work. Plagiarism detection software will be used to compare all
submissions pairwise (including submissions for similar assignments from previous offerings, if appropriate) and serious penalties will be applied, particularly in the case of repeat
offences.
DO NOT COPY FROM OTHERS; DO NOT ALLOW ANYONE TO SEE YOUR CODE
Please refer to the UNSW Policy on Academic Integrity and Plagiarism if you require further clarification on this matter.
Good luck!