

Assignment 2: Artificial Intelligence (3007_7059 Combined)


The dataset is available here

(https://myuni.adelaide.edu.au/courses/95211/files/1453***/download)

Part 1 Wine Quality Prediction with 1NN (K-d Tree)

Wine experts evaluate the quality of wine based on sensory data. We can also collect features of a wine from objective tests, and these objective features can then be used to predict the expert's judgment, i.e., the quality rating of the wine. This can be framed as a supervised learning problem, with the objective features as the data features and the wine quality rating as the data label.

In this assignment, we provide objective features obtained from physicochemical tests for each white wine sample, together with its corresponding rating provided by wine experts. You are expected to implement the k-d tree (KDT), train it on the training set, and then predict the wine quality of each sample in the test set by searching the tree.

Wine quality rating is measured in the range of 0-9. In our dataset, we only keep the samples for quality ratings 5, 6 and 7. The 11 objective features are listed as follows [1]:

f_acid : fixed acidity

v_acid : volatile acidity

c_acid : citric acid

res_sugar : residual sugar

chlorides : chlorides

fs_dioxide : free sulfur dioxide

ts_dioxide : total sulfur dioxide

density : density

pH : pH

sulphates : sulphates

alcohol : alcohol

Explanation of the Data.

train: The first 11 columns represent the 11 features and the 12th column is the wine quality. A sample is depicted as follows:

f_acid  v_acid  c_acid  res_sugar  chlorides  fs_dioxide  ts_dioxide  density  pH    sulphates  alcohol  quality
8.10    0.270   0.41    1.45       0.033      11.0        63.0        0.9**80  2.99  0.56       12.0     5
8.60    0.230   0.40    4.20       0.035      17.0        109.0       0.99**0  3.14  0.53       9.7      5
7.**    0.180   0.74    1.20       0.040      16.0        75.0        0.99200  3.18  0.63       10.8     5
8.30    0.420   0.62    19.25      0.040      41.0        172.0       1.00020  2.98  0.67       9.7      5
6.50    0.310   0.14    7.50       0.044      34.0        133.0       0.99550  3.22  0.50       9.5      5

test: The columns represent the 11 features. A sample is depicted as follows:

f_acid  v_acid  c_acid  res_sugar  chlorides  fs_dioxide  ts_dioxide  density  pH    sulphates  alcohol
7.0     0.360   0.14    11.60      0.043      35.0        228.0       0.99770  3.13  0.51       8.**0000
6.3     0.270   0.18    7.70       0.048      45.0        186.0       0.99620  3.23  0.**       9.000000
7.2     0.2**   0.20    7.70       0.046      51.0        174.0       0.99582  3.16  0.52       9.500000
7.1     0.140   0.35    1.40       0.039      24.0        128.0       0.99212  2.97  0.68       10.400000
7.6     0.480   0.28    10.40      0.049      57.0        205.0       0.99748  3.24  0.45       9.300000

1.1 1NN (K-d Tree)

From the given training data, our goal is to learn a function that can predict the wine quality rating of a wine sample, based on the objective features. In this assignment, the predictor function will be constructed as a k-d tree. Since the attributes (objective features) are continuously valued, you shall apply the k-d tree algorithm for continuous data, as outlined in Algorithm 1. It is the same as taught in the lecture. Once the tree is constructed, you will search the tree to find the nearest neighbour of a query point and label the query point. Please refer to the search logic taught in the lecture to write your code for the 1NN search.

 

Algorithm 1 BuildKdTree(P, D)
Require: A set of points P of M dimensions and the current depth D.
1:  if P is empty then
2:      return null
3:  else if P has only one data point then
4:      Create new node node
5:      node.d ← d
6:      node.val ← val
7:      node.point ← current point
8:      return node
9:  else
10:     d ← D mod M
11:     val ← median value along dimension d among points in P
12:     Create new node node
13:     node.d ← d
14:     node.val ← val
15:     node.point ← point at the median along dimension d
16:     node.left ← BuildKdTree(points in P whose value at dimension d is less than or equal to val, D + 1)
17:     node.right ← BuildKdTree(points in P whose value at dimension d is greater than val, D + 1)
18:     return node
19: end if

Note: Sorting is not necessary in some cases, depending on your implementation; please work out whether your code needs to sort the numbers first. Also, if you compute the median yourself: when there is an even number of points, say [1, 2, 3, 4], the median is 2.5.
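As a concrete sketch (not the required implementation), Algorithm 1 might be written in plain Python as follows. The names `KdNode`, `median`, and `build_kdtree` are illustrative, and the choice of "point at the median" when the median falls between two values (the point with the largest coordinate not exceeding it, stored at the node rather than in a subtree) is an assumption the hand-out leaves open:

```python
class KdNode:
    """Node shape from Algorithm 1: split dimension d, split value val,
    the stored data point, and two children."""
    def __init__(self, d, val, point, left=None, right=None):
        self.d, self.val, self.point = d, val, point
        self.left, self.right = left, right

def median(values):
    # Median as described in the note: for an even count,
    # average the two middle values, e.g. median([1, 2, 3, 4]) == 2.5.
    s = sorted(values)
    mid = len(s) // 2
    if len(s) % 2 == 1:
        return s[mid]
    return (s[mid - 1] + s[mid]) / 2

def build_kdtree(points, depth, n_dims):
    if not points:                        # lines 1-2: empty set
        return None
    d = depth % n_dims                    # line 10: cycle through dimensions
    if len(points) == 1:                  # lines 3-8: single point becomes a leaf
        return KdNode(d, points[0][d], points[0])
    val = median([p[d] for p in points])  # line 11: median along dimension d
    left_pts = [p for p in points if p[d] <= val]
    right_pts = [p for p in points if p[d] > val]
    # line 15: "point at the median" -- assumed here to be the point with
    # the largest coordinate <= val, kept at the node and out of the subtrees
    pivot = max(left_pts, key=lambda p: p[d])
    left_pts = [p for p in left_pts if p is not pivot]
    return KdNode(d, val, pivot,
                  build_kdtree(left_pts, depth + 1, n_dims),   # line 16
                  build_kdtree(right_pts, depth + 1, n_dims))  # line 17
```

For example, `build_kdtree([(2, 3), (5, 4), (9, 6), (4, 7), (8, 1), (7, 2)], 0, 2)` splits first on dimension 0, with split value 6.0 (the median of 2, 4, 5, 7, 8, 9).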

 

1.2 Deliverable

Write your k-d tree program in Python 3.6.9 in a file called nn_kdtree.py. Your program must be able to run as follows:

$ python nn_kdtree.py [train] [test] [dimension]

The inputs/options to the program are as follows:

[train] specifies the path to the training data file

[test] specifies the path to the testing data file

[dimension] specifies the dimension at which to start the comparison (see Algorithm 1)

Given the inputs, your program must construct a k-d tree (following the prescribed algorithms) using the training data, then predict the quality rating of each of the wine samples in the testing data. Your program must then print to standard output (i.e., the command prompt) the list of predicted wine quality ratings, vertically based on the order in which the testing cases appear in [test].
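For the search side, the lecture's 1NN logic (descend toward the query's side of each split first, then backtrack across the split only when a closer point could lie beyond it) might look like the sketch below. `Node` mirrors the fields built by Algorithm 1, and Euclidean distance is assumed; both names are illustrative:

```python
class Node:
    # minimal node shape assumed from Algorithm 1
    def __init__(self, d, val, point, left=None, right=None):
        self.d, self.val, self.point = d, val, point
        self.left, self.right = left, right

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def nn_search(node, query, best=None):
    """Return (distance, point) of the nearest stored point to `query`."""
    if node is None:
        return best
    dist = euclidean(query, node.point)
    if best is None or dist < best[0]:
        best = (dist, node.point)
    # descend into the side of the split that contains the query first
    if query[node.d] <= node.val:
        near, far = node.left, node.right
    else:
        near, far = node.right, node.left
    best = nn_search(near, query, best)
    # only cross the splitting plane if a closer point could lie beyond it
    if abs(query[node.d] - node.val) < best[0]:
        best = nn_search(far, query, best)
    return best
```

Once the nearest training point is found, the prediction for the query is simply that point's quality label.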

1.3 Python Libraries

You are allowed to use the Python standard library to write your k-d tree learning program (see https://docs.python.org/3/library/ for the components that make up the Python v3.6.9 standard library). In addition to the standard library, you are allowed to use NumPy and Pandas. Note that the marking program will not be able to run your program to completion if other third-party libraries are used. You are NOT allowed to use implemented tree structures from any Python package, otherwise the mark will be set to 0.

1.4 Submission

You must submit your program files on Gradescope. Please use the course code NPD6JD to enroll in the course. Instructions on accessing Gradescope and submitting assignments are provided at https://help.gradescope.com/article/5d3ifaeqi4-student-canvas.

For undergraduates, please submit your k-d tree program (nn_kdtree.py) to Assignment 2 - UG.

1.5 Expected Run Time

Your program must be able to terminate within 600 seconds on the sample data given.

 

1.6 Debugging Suggestions

Step-by-step debugging, checking intermediate values/results as you go, will help you identify problems in your code. Most Python IDEs support this; if yours does not, you can print the intermediate values instead. You can use the sample data, or create data in the same format, for debugging.

1.7 Assessment

Gradescope will compile and run your code on several test problems. If it passes all tests, you will get 15% (undergrads) or 12% (postgrads) of the overall course mark. For undergraduates, bonus marks of 3% will be awarded if Section 2 is completed correctly.

There will be no further manual inspection/grading of your program to award marks based on coding style, commenting, or “amount” of code written.

1.8 Using other source code

You may not use other source code for this assignment. All submitted code must be your own work written from scratch. Only by writing the solution yourself will you fully understand the concept.

1.9 Due date and late submission policy

This assignment is due by 11:59 pm Friday 3 May 2024. If your submission is late, the maximum mark you can obtain will be reduced by 25% per day (or part thereof) past the due date or any extension you are granted.

Part 2 Wine Quality Prediction with Random Forest

For postgraduate students, completing this section will give you the remaining 3% of the assignment marks. In this task, you will extend your knowledge of the k-d tree to a k-d forest. The process for building a simplified k-d forest from N input-output pairs is:

1. Randomly select a set of N' distinct samples (i.e., no duplicates), where N' = N × 80% (rounded to an integer). This dataset is used for constructing one k-d tree (i.e., the root node of that k-d tree).

 

2. Build a k-d tree on the dataset from (1) by applying Algorithm 1.

3. Repeat (1) and (2) until reaching the maximum number of trees.

This process is also shown in Algorithm 2. In k-d forest learning, a different sample set is used to construct each k-d tree; that is to say, different trees in the forest can be built from different root data. For prediction, the k-d forest chooses the most voted label as its prediction. For the wine quality prediction task, you shall apply Algorithm 2 for k-d forest learning and apply Algorithm 3 to predict the wine quality of a new wine sample. To generate samples, please use the following (incomplete) code so that you generate the same samples as our testing scripts:

    import random
    ...
    N = ...
    N_prime = ...                                  # N' in the text
    index_list = [i for i in range(0, N)]          # a list of indexes for all data
    sample_indexes = []
    for j in range(0, n_tree):
        random.seed(rand_seed + j)                 # rand_seed is one of the input parameters
        subsample_idx = random.sample(index_list, k=N_prime)  # create N' unique indices
        sample_indexes = sample_indexes + subsample_idx

Algorithm 2 KdForest(data, d_list, rand_seed)
Require: data in the form of N input-output pairs; d_list, a list of depths.
1: forest ← []
2: n_trees ← len(d_list)
3: sample_indexes ← N'·n_trees integers with values in [0, N), generated using the method above
4: count ← 0
5: for count < n_trees do
6:     sampled_data ← N' data pairs selected by the next N' indexes from sample_indexes, sequentially
7:     n ← BuildKdTree(sampled_data, d_list[count])    ⇒ Algorithm 1
8:     forest.append(n)
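Filled in with illustrative values (N, N', n_tree, and rand_seed are left open by the hand-out), the sampling snippet behaves deterministically. This sketch wraps it in a function so it can be checked:

```python
import random

def generate_sample_indexes(N, N_prime, n_tree, rand_seed):
    """Reproduce the hand-out's sampling loop: for each tree, reseed and
    draw N' distinct indexes from [0, N)."""
    index_list = [i for i in range(0, N)]
    sample_indexes = []
    for j in range(0, n_tree):
        random.seed(rand_seed + j)                 # one seed per tree
        subsample_idx = random.sample(index_list, k=N_prime)
        sample_indexes = sample_indexes + subsample_idx
    return sample_indexes

# illustrative values only
idx = generate_sample_indexes(N=10, N_prime=8, n_tree=3, rand_seed=5)
```

Because the loop reseeds with rand_seed + j on every iteration, every run yields the same index list, which is what the marking scripts rely on.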

 

9: end for
10: return forest

Algorithm 3 Predict_KdForest(forest, data)
Require: forest is a list of tree roots; data in the form of attribute values x.
1: labels ← []
2: for each tree n in the forest do
3:     label ← 1NN search on tree n
4:     labels.append(label)
5: end for
6: return the most voted label in labels
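Algorithm 3 reduces to a majority vote over per-tree 1NN labels. A minimal sketch, where `one_nn_label(root, x)` is a placeholder for the per-tree 1NN search the hand-out leaves to you:

```python
from collections import Counter

def predict_kdforest(forest, x, one_nn_label):
    """Algorithm 3: collect the 1NN label from each tree, return the majority.
    `one_nn_label(root, x)` is a placeholder for your per-tree 1NN search.
    Tie-breaking is unspecified in the hand-out; Counter keeps first-seen order."""
    labels = []
    for root in forest:                       # line 2
        labels.append(one_nn_label(root, x))  # lines 3-4 (append the label, not the tree)
    return Counter(labels).most_common(1)[0][0]  # line 6: most voted label
```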

2.1 Deliverables

Write your random forest program in Python 3.6.9 in a file called nn_kdforest.py. Your program must be able to run as follows:

$ python nn_kdforest.py [train] [test] [random_seed] [d_list]

The inputs/options to the program are as follows:

[train] specifies the path to the training data file

[test] specifies the path to the testing data file

[random_seed] is the seed value used to generate random values.

[d_list] is a list of depth values (in Algorithm 2, n_trees == len(d_list))

Given the inputs, your program must learn a random forest (following the prescribed algorithms) using the training data, then predict the quality rating of each wine sample in the testing data. Your program must then print to standard output (i.e., the command prompt) the list of predicted wine quality ratings, vertically based on the order in which the testing cases appear in [test].
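The hand-out does not pin down how [d_list] appears on the command line; one reasonable reading is one integer per remaining argument, as in `python nn_kdforest.py train test 0 12 14 16`. A hypothetical parsing sketch under that assumption:

```python
import sys

def parse_args(argv):
    """Hypothetical argument parsing for nn_kdforest.py; the depth-list
    format is an assumption, not specified by the assignment."""
    train_path = argv[1]
    test_path = argv[2]
    random_seed = int(argv[3])
    d_list = [int(d) for d in argv[4:]]   # n_trees == len(d_list)
    return train_path, test_path, random_seed, d_list
```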

Submit your program in the same way as the submission for Sec. 1. For postgraduates, please submit your learning programs (nn_kdtree.py and nn_kdforest.py) to Assignment 2 - PG. The due date, late submission policy, and code reuse policy are also the same as in Sec 1.

 

2.2 Expected Run Time

Your program must be able to terminate within 600 seconds on the sample data given.

2.3 Debugging Suggestions

In addition to the suggestions in Sec. 1.6, another value worth checking when debugging (but not the only one) is sample_indexes: with a fixed random seed, the indexes should be the same each time you run the code.

2.4 Assessment

Gradescope will compile and run your code on several test problems. If it passes all tests, you will get 3% of the overall course mark.
