annotate linear_regression.py @ 132:f6505ec32dc3

Updated documentation slightly
author Joseph Turian <turian@gmail.com>
date Thu, 08 May 2008 00:54:14 -0400
parents d0a1bd0378c6
children c9a89be5cb0a
"""
Implementation of linear regression, with or without L2 regularization.
This is one of the simplest examples of L{learner}, and illustrates
the use of theano.
"""

from learner import *
from theano import tensor as t
from theano.scalar import as_scalar
import numpy  # used by allocate() and forget()

class LinearRegression(MinibatchUpdatesTLearner):
    """
    Implement linear regression, with or without L2 regularization
    (the former is called Ridge Regression and the latter Ordinary Least Squares).

    The predictor parameters are obtained analytically from the training set.
    Training can proceed sequentially (with multiple calls to update with
    different disjoint subsets of the training sets). After each call to
    update the predictor is ready to be used (and optimized for the union
    of all the training sets passed to update since construction or since
    the last call to forget).

    For each (input_t, target_t) pair in a minibatch::

        output_t = b + W * input_t

    where b and W are obtained by minimizing::

        L2_regularizer * sum_{ij} W_{ij}^2  +  sum_t ||output_t - target_t||^2

    Let X be the whole training set inputs matrix (one input example per row),
    with the first column full of 1's, and let Y be the whole training set
    targets matrix (one example's target vector per row).
    Let theta = the matrix with b in its first column and W in the others,
    then each row theta[i,:] is the solution of the linear system::

        XtX * theta[i,:]' = XtY[:,i]

    where XtX is a (n_inputs+1)x(n_inputs+1) matrix containing X'*X
    plus L2_regularizer on the diagonal except at (0,0),
    and XtY is a (n_inputs+1)x(n_outputs) matrix containing X'*Y.

    The fields and attributes expected and produced by use and update are the following:

    - Input and output fields (example-wise quantities):

      - 'input' (always expected by use and update as an input_dataset field)
      - 'target' (optionally expected by use and update as an input_dataset field)
      - 'output' (optionally produced by use as an output dataset field)
      - 'squared_error' (optionally produced by use as an output dataset field, needs 'target') = example-wise squared error

    - optional attributes (optionally expected as input_dataset attributes)
      (warning, this may be dangerous, the 'use' method will use those provided in the
      input_dataset rather than those learned during 'update'; currently no support
      for providing these to update):

      - 'L2_regularizer'
      - 'b'
      - 'W'
      - 'parameters' = [b, W]
      - 'regularization_term'
      - 'XtX'
      - 'XtY'

    """

    def attributeNames(self):
        return ["L2_regularizer","parameters","b","W","regularization_term","XtX","XtY"]

    def useInputAttributes(self):
        return ["b","W"]

    def useOutputAttributes(self):
        return []

    def updateInputAttributes(self):
        return ["L2_regularizer","XtX","XtY"]

    def updateMinibatchInputFields(self):
        return ["input","target"]

    def updateMinibatchInputAttributes(self):
        return ["XtX","XtY"]

    def updateMinibatchOutputAttributes(self):
        return ["new_XtX","new_XtY"]

    def updateEndInputAttributes(self):
        return ["theta","XtX","XtY"]

    def updateEndOutputAttributes(self):
        return ["new_theta","b","W","regularization_term"] # CHECK: WILL b AND W CONTAIN OLD OR NEW THETA? @todo i.e. order of computation = ?

    def parameterAttributes(self):
        return ["b","W"]

    def defaultOutputFields(self, input_fields):
        output_fields = ["output"]
        if "target" in input_fields:
            output_fields.append("squared_error")
        return output_fields

    def __init__(self):
        self._input = t.matrix('input') # n_examples x n_inputs
        self._target = t.matrix('target') # n_examples x n_outputs
        self._L2_regularizer = as_scalar(0.,'L2_regularizer')
        self._theta = t.matrix('theta')
        self._W = self._theta[:,1:]
        self._b = self._theta[:,0]
        self._XtX = t.matrix('XtX')
        self._XtY = t.matrix('XtY')
        self._extended_input = t.prepend_one_to_each_row(self._input)
        self._output = t.dot(self._input,self._W.T) + self._b # (n_examples, n_outputs) matrix
        self._squared_error = t.sum_within_rows(t.sqr(self._output-self._target)) # (n_examples,) vector
        self._regularizer = self._L2_regularizer * t.sum(t.sqr(self._W)) # L2_regularizer * sum_{ij} W_{ij}^2
        self._new_XtX = add_inplace(self._XtX,t.dot(self._extended_input.T,self._extended_input))
        self._new_XtY = add_inplace(self._XtY,t.dot(self._extended_input.T,self._target))
        self._new_theta = t.solve_inplace(self._theta,self._XtX,self._XtY)
        # input/output dimensions are not known until allocate() sees the first minibatch
        self._n_inputs = None
        self._n_outputs = None

        MinibatchUpdatesTLearner.__init__(self)

    def allocate(self,minibatch):
        minibatch_n_inputs = minibatch["input"].shape[1]
        minibatch_n_outputs = minibatch["target"].shape[1]
        if not self._n_inputs:
            self._n_inputs = minibatch_n_inputs
            self._n_outputs = minibatch_n_outputs
            self.XtX = numpy.zeros((1+self._n_inputs,1+self._n_inputs))
            self.XtY = numpy.zeros((1+self._n_inputs,self._n_outputs))
            self.theta = numpy.zeros((self._n_outputs,1+self._n_inputs))
            self.forget()
        elif self._n_inputs!=minibatch_n_inputs or self._n_outputs!=minibatch_n_outputs:
            # if the input or target changes dimension on the fly, we resize and forget everything
            self._n_inputs = minibatch_n_inputs
            self._n_outputs = minibatch_n_outputs
            self.forget()

    def forget(self):
        if self._n_inputs and self._n_outputs:
            self.XtX.resize((1+self._n_inputs,1+self._n_inputs))
            self.XtY.resize((1+self._n_inputs,self._n_outputs))
            self.XtX[:,:] = 0
            self.XtY[:,:] = 0
            # put L2_regularizer on the diagonal of XtX, except at (0,0):
            # the bias (first row/column of the extended input) is not regularized
            d = numpy.arange(1,1+self._n_inputs)
            self.XtX[d,d] = self.L2_regularizer
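
As a concrete illustration of the analytic solution described in the class docstring, here is a small self-contained NumPy sketch. It is not part of this module and does not use the learner framework or theano; the class RidgeAccumulator and its methods are invented for the example. It accumulates XtX and XtY over successive minibatches and then solves the L2-regularized normal equations, adding L2_regularizer to the diagonal of XtX except at (0,0) so that the bias is not penalized. Unlike LinearRegression above, it stores theta as a (1+n_inputs) x n_outputs matrix, so b is theta's first row::

    import numpy

    class RidgeAccumulator(object):
        """Accumulate XtX = X'X and XtY = X'Y over disjoint minibatches,
        then solve the L2-regularized normal equations for theta."""

        def __init__(self, n_inputs, n_outputs, L2_regularizer=0.0):
            self.L2_regularizer = L2_regularizer
            self.XtX = numpy.zeros((1 + n_inputs, 1 + n_inputs))
            self.XtY = numpy.zeros((1 + n_inputs, n_outputs))

        def update(self, inputs, targets):
            # prepend a column of 1's so that the bias is carried by theta's first row
            X = numpy.hstack([numpy.ones((inputs.shape[0], 1)), inputs])
            self.XtX += numpy.dot(X.T, X)
            self.XtY += numpy.dot(X.T, targets)

        def solve(self):
            # L2_regularizer goes on the diagonal, except at (0,0): the bias is not regularized
            A = self.XtX.copy()
            i = numpy.arange(1, A.shape[0])
            A[i, i] += self.L2_regularizer
            return numpy.linalg.solve(A, self.XtY)   # (1+n_inputs) x n_outputs

        def predict(self, inputs, theta):
            return theta[0, :] + numpy.dot(inputs, theta[1:, :])

Because the minibatch statistics are simply summed into XtX and XtY, sequential calls to update followed by one solve give the same parameters as a single batch fit on the union of the minibatches, which is the property the class docstring claims for LinearRegression. A short usage sketch::

    rng = numpy.random.RandomState(0)
    inputs1, inputs2 = rng.randn(50, 3), rng.randn(50, 3)      # two disjoint minibatches
    W_true, b_true = rng.randn(3, 2), rng.randn(2)
    targets1 = numpy.dot(inputs1, W_true) + b_true
    targets2 = numpy.dot(inputs2, W_true) + b_true

    regression = RidgeAccumulator(n_inputs=3, n_outputs=2, L2_regularizer=0.1)
    regression.update(inputs1, targets1)
    regression.update(inputs2, targets2)
    theta = regression.solve()
    predictions = regression.predict(inputs1, theta)           # (50, 2) matrix of outputs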