Schéma de Structure Communal de Chaumont-Gistoux (V3)

Commune de Chaumont-Gistoux, Phase 1: Schéma de Structure Communal (Communal Structure Scheme)

1 Introduction
1.1 Preamble
1.1.1 Objectives
The objectives of this chapter are to:
- present the purpose of the SSC procedure;
- give an overview of the commune's context;
- present the structure of the report for the first phase, "Diagnosis of the commune";
- present the methodology adopted.
Published: Wednesday, March 28, 2012
Origin: chaumont-gistoux.be
Number of pages: 8
CSE 446: Machine Learning
Daniel Weld, Xiao Ling, Congle Zhang

Machine Learning: the study of algorithms that improve their performance at some task with experience.
Data + Machine Learning → Understanding
(Slides ©2005-2009 Carlos Guestrin.)

Exponential growth in data: ever more data is being captured, and machine learning is what turns that data into understanding.

Is this topic important?
Supremacy of Machine Learning

Machine learning is the preferred approach to:
- Speech recognition, natural language processing
- Web search (result ranking)
- Computer vision
- Medical outcomes analysis
- Robot control
- Computational biology
- Sensor networks
- …

This trend is accelerating:
- Improved machine learning algorithms
- Improved data capture, networking, faster computers
- Software too complex to write by hand
- New sensors / IO devices
- Demand for self-customization to user, environment
Prerequisites
- Probabilities: distributions, densities, marginalization, …
- Basic statistics: moments, typical distributions, regression, …
- Algorithms
- Programming: mostly your choice of language, but Python (NumPy) + Matlab will be very useful
- Ability to deal with "abstract mathematical concepts"

We provide some background, but the class will be fast paced.
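As a quick check of the "marginalization" item above, here is a minimal NumPy sketch; the joint distribution is made up purely for illustration and is not course material:

```python
import numpy as np

# Hypothetical joint distribution P(X, Y) over X in {0, 1}, Y in {0, 1, 2}.
joint = np.array([[0.10, 0.20, 0.10],
                  [0.25, 0.15, 0.20]])

# Marginalization: sum out the variable you don't care about.
p_x = joint.sum(axis=1)   # P(X) = sum over y of P(X, y)
p_y = joint.sum(axis=0)   # P(Y) = sum over x of P(x, Y)

print(p_x)   # [0.4 0.6]
print(p_y)   # [0.35 0.35 0.3]
```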
Text Books
- Required: Pattern Recognition and Machine Learning; Chris Bishop
- Optional:
  - Machine Learning; Tom Mitchell
  - The Elements of Statistical Learning: Data Mining, Inference, and Prediction; Trevor Hastie, Robert Tibshirani, Jerome Friedman
  - Information Theory, Inference, and Learning Algorithms; David MacKay
  - Website: Andrew Ng's AI class videos
  - Website: Tom Mitchell's AI class videos
Syllabus
Covers a wide range of machine learning techniques, from basic to state-of-the-art. You will learn about the methods you have heard about: naïve Bayes, logistic regression, nearest-neighbor, decision trees, boosting, neural nets, overfitting, regularization, dimensionality reduction, error bounds, loss functions, supervised learning, HMMs, graphical models, active learning. Covers algorithms, theory, and applications. It's going to be fun and hard work.
Staff
Two great TAs, a fantastic resource for learning; interact with them!
- Xiao Ling, CSE 610, xiaoling@cs. Office hours: TBA
- Congle Zhang, CSE 524, clzhang@cs. Office hours: TBA
- Administrative assistant: Alicen Smith, CSE 546, asmith@cs
Grading
- 4 homeworks (55%). First one goes out Fri 1/6/12. Start early, start early, start early!
- Midterm (15%): circa Feb 10, in class
- Final (30%): TBD by registrar
Space of ML Problems

Type of Supervision            What is being learned?
(e.g., Experience, Feedback)   Discrete Function    Continuous Function   Policy
Labeled Examples               Classification       Regression            Apprenticeship Learning
Nothing                        Clustering           -                     -
Reward                         -                    -                     Reinforcement Learning
Homeworks
Homeworks are hard; start early. Due at the beginning of class. Minus 33% credit for each day (or part of a day) late. All homeworks must be handed in, even for zero credit.

Collaboration
You may discuss the questions, but each student writes their own answers. Write on your homework anyone with whom you collaborate. Each student must write their own code for the programming part. Please don't search for answers on the web, Google, previous years' homeworks, etc. Ask us if you are not sure whether you can use a particular reference.

Communication
To email instructors, always use: cse446_instructor@cs
Subscribe to the class mailing list: http://mailman.cs.washington.edu/mailman/listinfo/cse446
Main discussion board: https://catalyst.uw.edu/gopost/board/xlin/25219/
Urgent announcements go out over the mailing list.
Classification: from data to discrete classes

Examples:
- Spam filtering
- Text classification: company home page vs. university home page vs. …
- Object detection: example training images for each orientation (Prof. H. Schneiderman)
- Reading a noun vs. a verb [Rustandi et al., 2005]
- Weather prediction

The classification pipeline: Training, then Testing.
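The Training/Testing pipeline can be sketched with a toy 1-nearest-neighbor classifier. The data, the function names, and the choice of 1-NN are illustrative assumptions, not the course's method:

```python
import numpy as np

def train(examples, labels):
    # "Training" for 1-NN is just memorizing the labeled examples.
    return np.asarray(examples, dtype=float), np.asarray(labels)

def predict(model, x):
    # Testing: label a new point with the label of its nearest training example.
    examples, labels = model
    distances = np.abs(examples - x)
    return labels[np.argmin(distances)]

# Toy 1-D "spam score" feature with made-up labels.
model = train([0.1, 0.4, 2.0, 2.5], ["ham", "ham", "spam", "spam"])
print(predict(model, 0.3))   # ham
print(predict(model, 2.2))   # spam
```

Real classifiers differ in how much work `train` does, but the split between a training phase and a testing phase is the same.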
Regression: predicting a numeric value

Examples:
- Stock market prediction
- Weather prediction revisited: predicting temperature
- Modeling sensor data: measure temperatures at some locations, then predict temperatures throughout the environment [Guestrin et al. '04]
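"Predicting a numeric value" can be sketched as an ordinary least-squares fit in NumPy; the sensor positions and temperatures below are invented for illustration:

```python
import numpy as np

# Toy data: sensor position -> measured temperature (made-up numbers).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([20.0, 21.1, 21.9, 23.2])   # roughly linear, with noise

# Fit y ≈ a*x + b by least squares.
A = np.vstack([x, np.ones_like(x)]).T
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict the temperature at an unmeasured location.
print(a * 1.5 + b)   # 21.55
```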
Clustering: discovering structure in data

Clustering data means grouping similar things. Example: clustering a set of images into similar groups [Goldberger et al.].
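"Group similar things" can be sketched with a minimal k-means loop. The 1-D data and the choice of k-means are illustrative assumptions; a real project would likely use a library implementation:

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    points = np.asarray(points, dtype=float)
    # Start from k randomly chosen data points as centers.
    centers = rng.choice(points, size=k, replace=False)
    for _ in range(iters):
        # Assign each point to its nearest center...
        assign = np.argmin(np.abs(points[:, None] - centers[None, :]), axis=1)
        # ...then move each center to the mean of its assigned points
        # (keeping the old center if a cluster ends up empty).
        centers = np.array([points[assign == j].mean() if np.any(assign == j)
                            else centers[j] for j in range(k)])
    return centers, assign

centers, assign = kmeans([1.0, 1.2, 0.8, 9.0, 9.5, 8.5], k=2)
print(np.sort(centers))   # [1. 9.]
```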
Another clustering example: clustering web search results.

In Summary
Machine learning problems can be organized along two axes: what is being learned (a discrete function, a continuous function, or a policy) and the type of supervision (e.g., experience, feedback). Labeled examples yield classification, regression, and apprenticeship learning; no supervision yields clustering; a reward signal yields reinforcement learning [Ng et al. '05].
Reinforcement Learning: learning to act, training by feedback

An agent:
- makes sensor observations
- must select actions
- receives rewards: positive for "good" states, negative for "bad" states
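The observe/act/reward loop can be sketched with tabular Q-learning on a made-up two-state world. The states, rewards, and hyperparameters are all illustrative assumptions, not the lecture's example:

```python
import random

# Made-up world: states "good" and "bad"; action 0 stays, action 1 switches.
def step(state, action):
    next_state = state if action == 0 else ("good" if state == "bad" else "bad")
    reward = 1 if next_state == "good" else -1   # positive for "good" states
    return next_state, reward

# Tabular Q-learning: learn action values from rewards alone.
q = {(s, a): 0.0 for s in ("good", "bad") for a in (0, 1)}
state = "bad"
random.seed(0)
for _ in range(500):
    # Epsilon-greedy: mostly exploit the current estimates, sometimes explore.
    if random.random() < 0.2:
        action = random.choice((0, 1))
    else:
        action = max((0, 1), key=lambda a: q[(state, a)])
    next_state, reward = step(state, action)
    # Q-update: move the estimate toward reward + discounted future value.
    best_next = max(q[(next_state, 0)], q[(next_state, 1)])
    q[(state, action)] += 0.1 * (reward + 0.9 * best_next - q[(state, action)])
    state = next_state

# The learned policy should switch out of the "bad" state.
print(max((0, 1), key=lambda a: q[("bad", a)]))
```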
Generalization
Hypotheses must generalize to correctly classify instances not in the training data. A hypothesis can be consistent with the training data and yet fail to generalize.

Why is Learning Possible?
Experience alone never justifies any conclusion about any unseen instance. Learning occurs when PREJUDICE meets DATA! (Example: learning a "Frobnitz".)
(Slides © Daniel S. Weld.)

Classifier
Hypothesis: a function for labeling examples.
[Figure: a number line from 0.0 to 6.0 with training examples labeled + and unlabeled query points marked ?]

ML = Function Approximation
We approximate the true concept c(x) with a hypothesis h(x); there may not be any perfect fit. Classification ~ discrete functions, for example:
h(x) = contains('nigeria', x) ∧ contains('wire transfer', x)
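The spam hypothesis above can be run directly as code; the two test emails are invented:

```python
def contains(word, x):
    # x is an email represented as lowercase text.
    return word in x

def h(x):
    # The slide's example hypothesis: classify as spam only if
    # both trigger phrases appear in the message.
    return contains("nigeria", x) and contains("wire transfer", x)

print(h("please wire transfer the funds to nigeria"))   # True
print(h("lunch on friday?"))                            # False
```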
Bias
The nice word for prejudice is "bias" (different from "bias" in statistics). What kind of hypotheses will you consider, i.e., what is the allowable range of functions you use when approximating? And what kind of hypotheses do you prefer?
Overfitting
Hypothesis H is overfit when there exists an H' such that H has smaller error than H' on the training examples, but H has bigger error on the test examples.
[Figure: accuracy on training data vs. on test data]

Some Typical Biases
- Occam's razor: "It is needless to do more when less will suffice" (William of Occam, died 1349 of the Black plague)
- MDL: minimum description length
- Concepts can be approximated by conjunctions of predicates, by linear functions, or by short decision trees
ML as Optimization
Specify a preference bias, aka a "loss function", then solve using optimization: combinatorial, convex, linear, or nasty.

Overfitting
Causes of overfitting: the training set is too small, or there is a large number of features. Overfitting is a big problem in machine learning; one solution is a validation set.
[Figure: accuracy vs. model complexity (e.g., number of nodes in the decision tree)]
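The validation-set idea can be sketched as follows: fit models of increasing complexity on a training split and keep the one with the lowest error on a held-out validation split. The sine data and the polynomial model family are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 40)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(40)   # noisy target

# Hold out every other point as a validation set.
x_tr, y_tr = x[::2], y[::2]
x_va, y_va = x[1::2], y[1::2]

def val_error(degree):
    # Fit a polynomial on the training split, score it on the validation split.
    coeffs = np.polyfit(x_tr, y_tr, degree)
    return np.mean((np.polyval(coeffs, x_va) - y_va) ** 2)

errors = {d: val_error(d) for d in range(1, 15)}
best = min(errors, key=errors.get)
print(best)   # a moderate degree beats the most complex model
```

Training error alone would keep dropping as the degree grows; the validation split is what exposes the overfit high-degree fits.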