Assignment 2: Artificial Intelligence (3007_7059 Combined)

Assignment 2

The dataset is available here

(https://myuni.adelaide.edu.au/courses/95211/files/1453***/download)

Part 1 Wine Quality Prediction with 1NN (K-d Tree)

Wine experts evaluate the quality of wine based on sensory data. We can also collect features of wine from objective tests, and these objective features can then be used to predict the expert's judgment, i.e., the quality rating of the wine. This can be formed as a supervised learning problem, with the objective features as the data features and the wine quality rating as the data labels.

In this assignment, we provide objective features obtained from physicochemical tests for each white wine sample, together with its corresponding rating provided by wine experts. You are expected to implement the k-d tree (KDT), use the training set to build your k-d tree, and then predict the wine quality of the test set by searching the tree.

Wine quality rating is measured in the range of 0-9. In our dataset, we only keep the samples for quality ratings 5, 6 and 7. The 11 objective features are listed as follows [1]:

f_acid : fixed acidity

v_acid : volatile acidity

c_acid : citric acid

res_sugar : residual sugar

chlorides : chlorides

fs_dioxide : free sulfur dioxide

ts_dioxide : total sulfur dioxide

density : density

pH : pH

sulphates : sulphates

alcohol : alcohol

Explanation of the Data.

train: The first 11 columns represent the 11 features and the 12th column is the wine quality. A sample is depicted as follows:

f_acid  v_acid  c_acid  res_sugar  chlorides  fs_dioxide  ts_dioxide  density  pH    sulphates  alcohol  quality
8.10    0.270   0.41    1.45       0.033      11.0        63.0        0.9**80  2.99  0.56       12.0     5
8.60    0.230   0.40    4.20       0.035      17.0        109.0       0.99**0  3.14  0.53       9.7      5
7.**    0.180   0.74    1.20       0.040      16.0        75.0        0.99200  3.18  0.63       10.8     5
8.30    0.420   0.62    19.25      0.040      41.0        172.0       1.00020  2.98  0.67       9.7      5
6.50    0.310   0.14    7.50       0.044      34.0        133.0       0.99550  3.22  0.50       9.5      5

test: The first 11 columns represent the 11 features and the 12th column is the wine quality. A sample is depicted as follows:

f_acid  v_acid  c_acid  res_sugar  chlorides  fs_dioxide  ts_dioxide  density  pH    sulphates  alcohol
7.0     0.360   0.14    11.60      0.043      35.0        228.0       0.99770  3.13  0.51       8.**0000
6.3     0.270   0.18    7.70       0.048      45.0        186.0       0.99620  3.23  0.**       9.000000
7.2     0.2**   0.20    7.70       0.046      51.0        174.0       0.99582  3.16  0.52       9.500000
7.1     0.140   0.35    1.40       0.039      24.0        128.0       0.99212  2.97  0.68       10.400000
7.6     0.480   0.28    10.40      0.049      57.0        205.0       0.99748  3.24  0.45       9.300000
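Assuming the data files are whitespace-delimited text with a header row in the layout depicted above (the separator is an assumption; adjust it to match the real files), they can be loaded with Pandas. The sketch below uses an in-memory stand-in built from two of the uncensored sample rows:

```python
import io
import pandas as pd

# A two-row stand-in for the training file, in the depicted layout.
sample = io.StringIO(
    "f_acid v_acid c_acid res_sugar chlorides fs_dioxide ts_dioxide density pH sulphates alcohol quality\n"
    "8.30 0.420 0.62 19.25 0.040 41.0 172.0 1.00020 2.98 0.67 9.7 5\n"
    "6.50 0.310 0.14 7.50 0.044 34.0 133.0 0.99550 3.22 0.50 9.5 5\n"
)
train = pd.read_csv(sample, sep=r"\s+")
X = train.iloc[:, :11].to_numpy()   # the 11 objective features
y = train.iloc[:, 11].to_numpy()    # the quality rating (12th column)
```

For the real files, replace `sample` with the path passed on the command line.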

1.1 1NN (K-d Tree)

From the given training data, our goal is to learn a function that predicts the wine quality rating of a wine sample from its objective features. In this assignment, the predictor function will be constructed as a k-d tree. Since the attributes (objective features) are continuously valued, you shall apply the k-d tree algorithm for continuous data, as outlined in Algorithm 1; it is the same as taught in the lecture. Once the tree is constructed, you will search the tree to find the 1-nearest neighbour of a query point and label the query point accordingly. Please refer to the search logic taught in the lecture to write your code for the 1NN search.

 

Algorithm 1 BuildKdTree(P, D)
Require: A set of points P of M dimensions and current depth D.
 1: if P is empty then
 2:     return null
 3: else if P only has one data point then
 4:     Create new node node
 5:     node.d ← D mod M
 6:     node.val ← value of the point at dimension node.d
 7:     node.point ← current point
 8:     return node
 9: else
10:     d ← D mod M
11:     val ← Median value along dimension d among points in P
12:     Create new node node
13:     node.d ← d
14:     node.val ← val
15:     node.point ← point at the median along dimension d
16:     node.left ← BuildKdTree(points in P for which value at dimension d is less than or equal to val, D + 1)
17:     node.right ← BuildKdTree(points in P for which value at dimension d is greater than val, D + 1)
18:     return node
19: end if

Note: Sorting is not necessary in some cases, depending on your implementation; work out whether your code needs to sort the points first. Also, if you compute the median yourself, note that for an even number of points, say [1, 2, 3, 4], the median is 2.5.
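As a concrete aid, the construction in Algorithm 1 can be sketched in Python as below. Two details are assumptions layered on top of the pseudocode: the node's stored point is taken to be the one whose value at dimension d is closest to the median, and that stored point is excluded from the subtrees so the recursion always terminates.

```python
import numpy as np

class Node:
    """k-d tree node: splitting dimension d, splitting value val, a stored point."""
    def __init__(self, d, val, point, left=None, right=None):
        self.d, self.val, self.point = d, val, point
        self.left, self.right = left, right

def build_kdtree(P, D, M):
    """Algorithm 1 sketch: P is a list of length-M feature vectors, D the current depth."""
    if len(P) == 0:
        return None
    d = D % M                                   # cycle through the dimensions
    if len(P) == 1:
        return Node(d, P[0][d], P[0])
    vals = np.array([p[d] for p in P])
    val = float(np.median(vals))                # e.g. median of [1, 2, 3, 4] is 2.5
    i = int(np.argmin(np.abs(vals - val)))      # assumption: store the point nearest the median
    node = Node(d, val, P[i])
    left = [p for j, p in enumerate(P) if j != i and p[d] <= val]
    right = [p for j, p in enumerate(P) if j != i and p[d] > val]
    node.left = build_kdtree(left, D + 1, M)
    node.right = build_kdtree(right, D + 1, M)
    return node
```

For example, building on six 2-D points produces a root that splits on dimension 0 at the median of the first coordinates.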

 

1.2 Deliverable

Write your k-d tree program in Python 3.6.9 in a file called nn_kdtree.py. Your program must be able to run as follows:

$ python nn_kdtree.py [train] [test] [dimension]

The inputs/options to the program are as follows:

[train] specifies the path to the training data file

[test] specifies the path to the testing data file

[dimension] decides which dimension to start the comparison from (see Algorithm 1)

Given the inputs, your program must construct a k-d tree (following the prescribed algorithms) using the training data, then predict the quality rating of each of the wine samples in the testing data. Your program must then print to standard output (i.e., the command prompt) the list of predicted wine quality ratings, vertically based on the order in which the testing cases appear in [test].
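The prediction step is a 1NN search that descends to the query's region, then backtracks, pruning subtrees that cannot contain a closer point. A minimal sketch, assuming nodes carry the fields produced by Algorithm 1 (in the assignment each stored point would also carry its quality label; here points are bare feature vectors for brevity):

```python
import numpy as np

class Node:
    """Minimal node with the fields produced by Algorithm 1 (d, val, point)."""
    def __init__(self, d, val, point, left=None, right=None):
        self.d, self.val, self.point = d, val, point
        self.left, self.right = left, right

def nn_search(node, query, best=None):
    """Return (distance, point) of the nearest stored point to query."""
    if node is None:
        return best
    dist = float(np.linalg.norm(np.asarray(query) - np.asarray(node.point)))
    if best is None or dist < best[0]:
        best = (dist, node.point)
    # Descend the side of the splitting plane that contains the query first.
    near, far = (node.left, node.right) if query[node.d] <= node.val else (node.right, node.left)
    best = nn_search(near, query, best)
    # Visit the far side only if the splitting plane is closer than the best so far.
    if abs(query[node.d] - node.val) < best[0]:
        best = nn_search(far, query, best)
    return best
```

Printing the label of the returned point for each test row, one per line, matches the required output format.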

1.3 Python Libraries

You are allowed to use the Python standard library to write your k-d tree learning program (see https://docs.python.org/3/library/ for the components that make up the Python v3.6.9 standard library). In addition to the standard library, you are allowed to use NumPy and Pandas. Note that the marking program will not be able to run your program to completion if other third-party libraries are used. You are NOT allowed to use tree structures already implemented in any Python package; otherwise your mark will be set to 0.

1.4 Submission

You must submit your program files on Gradescope. Please use the course code NPD6JD to enroll in the course. Instructions on accessing Gradescope and submitting assignments are provided at https://help.gradescope.com/article/5d3ifaeqi4-student-canvas.

For undergraduates, please submit your k-d tree program (nn_kdtree.py) to Assignment 2 - UG.

1.5 Expected Run Time

Your program must be able to terminate within 600 seconds on the sample data given.

 

1.6 Debugging Suggestions

Step-by-step debugging, checking intermediate values/results, will help you identify problems in your code. Most Python IDEs support this; if yours does not, you can print the intermediate values instead. You can use the sample data, or create data in the same format, for debugging.

1.7 Assessment

Gradescope will compile and run your code on several test problems. If it passes all tests, you will get 15% (undergrads) or 12% (postgrads) of the overall course mark. For undergraduates, bonus marks of 3% will be awarded if Section 2 is completed correctly.

There will be no further manual inspection/grading of your program to award marks based on coding style, commenting, or “amount” of code written.

1.8 Using other source code

You may not use other source code for this assignment. All submitted code must be your own work written from scratch. Only by writing the solution yourself will you fully understand the concept.

1.9 Due date and late submission policy

This assignment is due by 11:59 pm Friday 3 May 2024. If your submission is late, the maximum mark you can obtain will be reduced by 25% per day (or part thereof) past the due date or any extension you are granted.

Part 2 Wine Quality Prediction with Random Forest

For postgraduate students, completing this section will give you the remaining 3% of the assignment marks. In this task, you will extend what you learned about k-d trees to a k-d forest. The process for building a simplified k-d forest from N input-output pairs is:

1. Randomly select a set of N' distinct samples (i.e., no duplicates), where N' = N × 80% (rounded to an integer). This dataset is used for constructing one k-d tree (i.e., it is the data passed to the root node of that k-d tree).

 

2. Build a k-d tree on the dataset from (1) using Algorithm 1.

3. Repeat (1) and (2) until reaching the maximum number of trees.

This process is also shown in Algorithm 2. In k-d forest learning, each tree is constructed from its own sample set; that is to say, different trees in the forest can have different root data. For prediction, the k-d forest chooses the most voted label as its prediction. For the wine quality prediction task, you shall apply Algorithm 2 for k-d forest learning and Algorithm 3 to predict the wine quality of a new wine sample. To generate samples, please use the following (incomplete) code so that you obtain the same samples as our testing scripts:

import random
...
N = ...
N_prime = ...                                 # N' in the text
index_list = [i for i in range(0, N)]         # create a list of indexes for all data
sample_indexes = []
for j in range(0, n_tree):
    random.seed(rand_seed + j)                # rand_seed is one of the input parameters
    subsample_idx = random.sample(index_list, k=N_prime)   # create N' unique indices
    sample_indexes = sample_indexes + subsample_idx

Algorithm 2 KdForest(data, d_list, rand_seed)
Require: data in the form of N input-output pairs, d_list a list of depths.
 1: forest ← []
 2: n_trees ← len(d_list)
 3: sample_indexes ← N' × n_trees integers with values in [0, N), generated using the code above
 4: count ← 0
 5: for count < n_trees do
 6:     sampled_data ← N' data pairs selected by the next N' indexes from sample_indexes, taken sequentially
 7:     n ← BuildKdTree(sampled_data, d_list[count])    ⇒ Algorithm 1
 8:     forest.append(n)
 9: end for
10: return forest

Algorithm 3 Predict_KdForest(forest, data)
Require: forest is a list of tree roots, data in the form of attribute values x.
1: labels ← []
2: for each tree n in forest do
3:     label ← 1NN search on tree n
4:     labels.append(label)
5: end for
6: return the most voted label in labels
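For reference, the index-generation snippet and the vote in Algorithm 3 can be made concrete as below; `make_sample_indexes` and `majority_vote` are illustrative names, not part of the spec, and the tree build/search calls are the ones from Part 1:

```python
import random
from collections import Counter

def make_sample_indexes(N, N_prime, n_trees, rand_seed):
    """Reproduce the assignment's sampling: for each tree j, reseed with
    rand_seed + j and draw N_prime unique indices; concatenate the draws."""
    index_list = list(range(N))
    sample_indexes = []
    for j in range(n_trees):
        random.seed(rand_seed + j)
        sample_indexes += random.sample(index_list, k=N_prime)
    return sample_indexes

def majority_vote(labels):
    """Algorithm 3, line 6: return the most voted label."""
    return Counter(labels).most_common(1)[0][0]
```

Because each draw is reseeded, the same (N, N_prime, n_trees, rand_seed) always yields the same index list, which is a useful debugging check.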

2.1 Deliverables

Write your random forest program in Python 3.6.9 in a file called nn_kdforest.py. Your program must be able to run as follows:

$ python nn_kdforest.py [train] [test] [random_seed] [d_list]

The inputs/options to the program are as follows:

[train] specifies the path to the training data file

[test] specifies the path to the testing data file

[random_seed] is the seed value used to generate random values

[d_list] is a list of depth values (in Algorithm 2, n_trees == len(d_list))

Given the inputs, your program must learn a random forest (following the prescribed algorithms) using the training data, then predict the quality rating of each wine sample in the testing data. Your program must then print to standard output (i.e., the command prompt) the list of predicted wine quality ratings, vertically based on the order in which the testing cases appear in [test].

Submit your program in the same way as the submission for Sec. 1. For postgraduates, please submit your learning programs (nn_kdtree.py and nn_kdforest.py) to Assignment 2 - PG. The due date, late submission policy, and code reuse policy are also the same as in Sec 1.

 

2.2 Expected Run Time

Your program must be able to terminate within 600 seconds on the sample data given.

2.3 Debugging Suggestions

In addition to the suggestions in Sec. 1.6, another value worth checking when debugging is (but is not limited to) sample_indexes: with a fixed random seed, the indexes should be identical each time you run the code.

2.4 Assessment

Gradescope will compile and run your code on several test problems. If it passes all tests, you will get 3% of the overall course mark.
