
Upgrade TF.Net version from 0.10.10 to 0.11.3 #4205

Merged: 16 commits into dotnet:master on Sep 17, 2019

Conversation

ashbhandare (Contributor)

Fixes #4204

Includes a few API changes necessary to upgrade TF.Net version.
Verified that ML.Samples and Tests run.

@ashbhandare requested a review from a team as a code owner on September 11, 2019 at 21:43
@dnfclas commented Sep 11, 2019

CLA assistant check
All CLA requirements met. #Resolved

@codemzs (Member) commented Sep 11, 2019

Why are the files below added? They should not be added; please remove them.

(screenshot of the added files)
#Resolved

@@ -34,11 +34,13 @@ public static void Example()
var idv = mlContext.Data.LoadFromEnumerable(data);

// Create a ML pipeline.

var pipeline = mlContext.Model.LoadTensorFlowModel(modelLocation)
@codemzs (Member) commented Sep 11, 2019

remove this new line. #Resolved

var pipeline = mlContext.Model.LoadTensorFlowModel(modelLocation)
.ScoreTensorFlowModel(
new[] { nameof(OutputScores.output) },
new[] { nameof(TensorData.input) }, addBatchDimensionInput: true);



@codemzs (Member) commented Sep 11, 2019

remove this line and please revert this file. #Resolved

@@ -245,13 +245,14 @@ private void CheckTrainingParameters(DnnRetrainEstimator.Options options)

tfInputShape = new TensorShape(colTypeDims);
}
if (tfInputShape.NDim != -1)
if (tfInputShape.ndim != -1)
@codemzs (Member) commented Sep 11, 2019

tfInputShape.ndim

cache this value. #Resolved
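For context, the caching being asked for amounts to roughly the following (a sketch that merges the two nearby diff snippets and reuses their identifiers; local names are illustrative and this is not verified against the TF.NET 0.11.3 API):

    var inputNdim = tfInputShape.ndim;   // read the interop-backed property once and reuse it
    var inputDims = tfInputShape.dims;   // likewise for the dimensions array
    if (inputNdim != -1)
    {
        var newShape = new int[inputNdim];
        newShape[0] = inputDims[0] == 0 || inputDims[0] == -1 ? batchSize : inputDims[0];
    }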

loss = tensor.Length > 0 && tensor[0] != IntPtr.Zero ? (float)tensor[0].Data<float>()[0] : 0.0f;
metric = tensor.Length > 1 && tensor[1] != IntPtr.Zero ? (float)tensor[1].Data<float>()[0] : 0.0f;
var b = tensor.Length > 2 && tensor[2] != IntPtr.Zero ? (float[])tensor[2].Data<float>() : null;
//var buffer = tensor[0].Data();
@codemzs (Member) commented Sep 11, 2019

//var buffer = tensor[0].Data();

delete. #Resolved

var newShape = new int[tfInputShape.ndim];
var dims = tfInputShape.dims;
newShape[0] = dims[0] == 0 || dims[0] == -1 ? batchSize : dims[0];
//var v = tfInputShape[0].Equals(0);
@bpstark (Contributor) commented Sep 11, 2019

should remove unused code rather than commenting it out. #Resolved

loss = tensor.Length > 0 && tensor[0] != IntPtr.Zero ? (float)tensor[0].Data<float>()[0] : 0.0f;
metric = tensor.Length > 1 && tensor[1] != IntPtr.Zero ? (float)tensor[1].Data<float>()[0] : 0.0f;
var b = tensor.Length > 2 && tensor[2] != IntPtr.Zero ? (float[])tensor[2].Data<float>() : null;
//var buffer = tensor[0].Data();
@bpstark (Contributor) commented Sep 11, 2019

same here #Resolved


tf.train.import_meta_graph(path);

return graph;
@codemzs (Member) commented Sep 11, 2019

remove new lines. #Resolved

@eerhardt (Member) left a comment

Please remove the CMake output files. And ensure this doesn't regress performance with the calls to ToArray.

@@ -34,11 +34,13 @@ public static void Example()
var idv = mlContext.Data.LoadFromEnumerable(data);

// Create a ML pipeline.

@eerhardt (Member) commented Sep 11, 2019

I'm not sure if these changes are intended. Can they be reverted? #Resolved

var newShape = new int[tfInputShape.ndim];
var dims = tfInputShape.dims;
newShape[0] = dims[0] == 0 || dims[0] == -1 ? batchSize : dims[0];
//var v = tfInputShape[0].Equals(0);
@eerhardt (Member) commented Sep 11, 2019

Is this comment useful? #Resolved

loss = tensor.Length > 0 && tensor[0] != IntPtr.Zero ? (float)tensor[0].Data<float>()[0] : 0.0f;
metric = tensor.Length > 1 && tensor[1] != IntPtr.Zero ? (float)tensor[1].Data<float>()[0] : 0.0f;
var b = tensor.Length > 2 && tensor[2] != IntPtr.Zero ? (float[])tensor[2].Data<float>() : null;
//var buffer = tensor[0].Data();
@eerhardt (Member) commented Sep 11, 2019

Is this comment useful? #Resolved

src/Microsoft.ML.Dnn/DnnRetrainTransform.cs (conversation resolved)
var l = new int[originalShape.ndim];
for (int ishape = 0; ishape < originalShape.ndim; ishape++)
{
var dims = originalShape.dims;
@eerhardt (Member) commented Sep 11, 2019

I think this line can be lifted outside of the for loop. No reason to do it every time through the loop. #Resolved
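Concretely, the hoisting suggested here would look something like this (a sketch based on the loop shown above, with d coming from the surrounding code; not a verified compile):

    var dims = originalShape.dims;                 // read once, outside the loop
    var l = new int[originalShape.ndim];
    for (int ishape = 0; ishape < l.Length; ishape++)
        l[ishape] = dims[ishape] == -1 ? (int)d : dims[ishape];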

@@ -0,0 +1 @@
{"requests":[{"kind":"cache","version":2},{"kind":"cmakeFiles","version":1},{"kind":"codemodel","version":2}]}
@eerhardt (Member) commented Sep 11, 2019

I assume all changes in src\Native are not intended. Please revert. #Resolved

@@ -1023,7 +1018,7 @@ internal sealed class Options : TransformInputBase
/// <summary>
/// Location of the TensorFlow model.
/// </summary>
[Argument(ArgumentType.Required, HelpText = "TensorFlow model used by the transform. Please see https://www.tensorflow.org/mobile/prepare_models for more details.", SortOrder = 0)]
[Argument(ArgumentType.Required, HelpText = "TensorFlow model used by the transform. Please see https://www.tf.org/mobile/prepare_models for more details.", SortOrder = 0)]
@eerhardt (Member) commented Sep 11, 2019

I can't access this new URL. Is it correct? #Resolved

@@ -16,7 +16,7 @@
<PackageReference Include="SciSharp.TensorFlow.Redist" Version="1.14.0" />
<PackageReference Include="System.IO.FileSystem.AccessControl" Version="$(SystemIOFileSystemAccessControl)" />
<PackageReference Include="System.Security.Principal.Windows" Version="$(SystemSecurityPrincipalWindows)" />
<PackageReference Include="TensorFlow.NET" Version="0.10.10" />
<PackageReference Include="TensorFlow.NET" Version="0.11.3" />
@eerhardt (Member) commented Sep 11, 2019

Don't forget to also update the file in the pkg. Usually we put all the dependency versions in a single file - https://github.com/dotnet/machinelearning/blob/master/build/Dependencies.props. That way they only need to be updated in a single place. #Resolved

@@ -428,7 +428,7 @@ private protected override void SaveModel(ModelSaveContext ctx)
var buffer = Session.graph.ToGraphDef(status);
ctx.SaveBinaryStream("TFModel", w =>
{
w.WriteByteArray(buffer.Data);
w.WriteByteArray(buffer.MemoryBlock.ToArray());
@eerhardt (Member) commented Sep 11, 2019

Same comment about ToArray.

I'll stop commenting on them and assume you will test the perf before and after these changes, and fix any perf degradations. #Resolved
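As a rough way to sanity-check that concern, a standalone timing harness like the following can show what an extra managed copy of a large serialized graph costs (illustrative C#, not ML.NET or TF.NET code; the buffer size is made up):

    using System;
    using System.Diagnostics;
    using System.Linq;

    class CopyTimingDemo
    {
        static void Main()
        {
            var graphBytes = new byte[64 * 1024 * 1024];   // stand-in for a large serialized TF graph
            var sw = Stopwatch.StartNew();
            var copy = graphBytes.ToArray();               // the kind of extra copy MemoryBlock.ToArray() introduces
            sw.Stop();
            Console.WriteLine($"Copied {copy.Length} bytes in {sw.ElapsedMilliseconds} ms");
        }
    }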

@@ -530,14 +526,14 @@ private void VariableSummaries(RefVariable var)
private (Operation, Tensor, Tensor, Tensor) AddFinalRetrainOps(int classCount, string labelColumn,
string scoreColumnName, float learningRate, Tensor bottleneckTensor, bool isTraining)
{
var (batch_size, bottleneck_tensor_size) = (bottleneckTensor.TensorShape.Dimensions[0], bottleneckTensor.TensorShape.Dimensions[1]);
var (batch_size, bottleneck_tensor_size) = (bottleneckTensor.TensorShape.dims[0], bottleneckTensor.TensorShape.dims[1]);
@codemzs (Member) commented Sep 11, 2019

dims

cache #Resolved

@@ -757,7 +752,7 @@ private protected override void SaveModel(ModelSaveContext ctx)
var buffer = _session.graph.ToGraphDef(status);
ctx.SaveBinaryStream("TFModel", w =>
{
w.WriteByteArray(buffer.Data);
w.WriteByteArray(buffer.MemoryBlock.ToArray());
@codemzs (Member) commented Sep 11, 2019

w.WriteByteArray(buffer.MemoryBlock.ToArray());

can you please do a diff between the graph with your change and master and see if there are any changes? I know the graph size increased and I want to know why and by how much. #Resolved

@ashbhandare (Contributor, Author) replied:

The sizes of the files are exactly the same between master and my change, but doing a diff shows that the files differ. Are the trained values expected to be exactly the same?

In reply to: 323478331

@@ -380,7 +380,7 @@ internal static (TF_DataType[] tfOutputTypes, DataViewType[] outputTypes, (Opera
// i.e. the first dimension (if unknown) is assumed to be batch dimension.
// If there are other dimension that are unknown the transformer will return a variable length vector.
// This is the work around in absence of reshape transformer.
int[] dims = shape.NDim > 0 ? shape.Dimensions.Skip(shape[0] == -1 ? 1 : 0).ToArray() : new[] { 0 };
int[] dims = shape.ndim > 0 ? shape.dims.Skip(shape.dims[0] == -1 ? 1 : 0).ToArray() : new[] { 0 };
@codemzs (Member) commented Sep 11, 2019

dims

cache #Resolved

for (int ishape = 0; ishape < originalShape.NDim; ishape++)
l[ishape] = originalShape[ishape] == -1 ? (int)d : originalShape[ishape];
var l = new int[originalShape.ndim];
for (int ishape = 0; ishape < originalShape.ndim; ishape++)
@codemzs (Member) commented Sep 11, 2019

ndim

cache #Resolved

l[ishape] = originalShape[ishape] == -1 ? (int)d : originalShape[ishape];
var l = new int[originalShape.ndim];
for (int ishape = 0; ishape < originalShape.ndim; ishape++)
l[ishape] = originalShape.dims[ishape] == -1 ? (int)d : originalShape.dims[ishape];
@codemzs (Member) commented Sep 11, 2019

dims

cache #Resolved

@@ -635,6 +635,7 @@ private void UpdateCacheIfNeeded(long position, ITensorValueGetter[] srcTensorGe
{
if (outputCache.Position != position)
{
_parent.Session.graph.as_default();
@codemzs (Member) commented Sep 11, 2019

_parent.Session.graph.as_default();

This line sets the graph as the default graph every time the cursor moves to a new row. We only need to do this once in the cursor's thread. Please fix; this call is an interop call and can reduce scoring performance. #Resolved
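The requested fix amounts to moving the call out of the per-row path and into the getter construction, roughly like this (a sketch that mirrors the MakeGetter override quoted later in this thread; not a verified compile):

    protected override Delegate MakeGetter(DataViewRow input, int iinfo, Func<int, bool> activeOutput, out Action disposer)
    {
        disposer = null;
        _parent.Session.graph.as_default();   // one interop call per getter/cursor instead of one per row
        // ... existing getter construction continues unchanged ...
        return Utils.MarshalInvoke(MakeGetter<int>, type, input, iinfo, srcTensorGetters, activeOutputColNames, outputCache);
    }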

@codemzs (Member) left a comment

🕐

@@ -38,7 +38,7 @@ public static void Example()
.ScoreTensorFlowModel(
new[] { nameof(OutputScores.output) },
new[] { nameof(TensorData.input) }, addBatchDimensionInput: true);

@safern (Member) commented Sep 12, 2019

Nit: tabs in empty line. #Resolved

var b = tensor.Length > 2 && tensor[2] != IntPtr.Zero ? (float[])tensor[2].Data<float>() : null;
loss = tensor.Length > 0 && tensor[0] != IntPtr.Zero ? (float)tensor[0].ToArray<float>()[0] : 0.0f;
metric = tensor.Length > 1 && tensor[1] != IntPtr.Zero ? (float)tensor[1].ToArray<float>()[0] : 0.0f;
var b = tensor.Length > 2 && tensor[2] != IntPtr.Zero ? (float[])tensor[2].ToArray<float>() : null;
@eerhardt (Member) commented Sep 13, 2019

I don't see b being used anywhere. Can it be removed? #Resolved

@@ -38,7 +38,7 @@ public static void Example()
.ScoreTensorFlowModel(
new[] { nameof(OutputScores.output) },
new[] { nameof(TensorData.input) }, addBatchDimensionInput: true);

@eerhardt (Member) commented Sep 13, 2019

Can you revert this file? Changing files unnecessarily bloats the history. #Resolved

@@ -857,7 +861,7 @@ protected override Delegate MakeGetter(DataViewRow input, int iinfo, Func<int, b
return Utils.MarshalInvoke(MakeGetter<int>, type, input, iinfo, srcTensorGetters, activeOutputColNames, outputCache);
}

private Delegate MakeGetter<T>(DataViewRow input, int iinfo, ITensorValueGetter[] srcTensorGetters, string[] activeOutputColNames, OutputCache outputCache)
private Delegate MakeGetter<T>(DataViewRow input, int iinfo, ITensorValueGetter[] srcTensorGetters, string[] activeOutputColNames, OutputCache outputCache) where T: unmanaged
@eerhardt (Member) commented Sep 13, 2019

where T: unmanaged

Can this ever be used with a string type? If so, this is going to fail at runtime since we use reflection to call this method. #Resolved

@ashbhandare (Contributor, Author) replied:

I have made a separate function for when T is string to avoid this failure.

In reply to: 324317321
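A toy, self-contained illustration of that split (the names here are hypothetical and not the ML.NET internals): one generic path constrained to unmanaged element types, plus a parallel unconstrained path for string columns, so reflection-based dispatch never has to instantiate the constrained method with string.

    using System;

    static class GetterSplitDemo
    {
        // Numeric/tensor path: element type must be unmanaged (blittable).
        public static Func<T[]> MakeGetter<T>(int length) where T : unmanaged
            => () => new T[length];

        // Separate path for string columns, with no unmanaged constraint.
        public static Func<string[]> MakeStringGetter(int length)
            => () => new string[length];

        static void Main()
        {
            Console.WriteLine(MakeGetter<float>(4)().Length);   // 4
            Console.WriteLine(MakeStringGetter(4)().Length);    // 4
        }
    }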

@@ -10,7 +10,7 @@
<ItemGroup>
<PackageReference Include="System.IO.FileSystem.AccessControl" Version="$(SystemIOFileSystemAccessControl)" />
<PackageReference Include="System.Security.Principal.Windows" Version="$(SystemSecurityPrincipalWindows)" />
<PackageReference Include="TensorFlow.NET" Version="0.10.10" />
<PackageReference Include="TensorFlow.NET" Version="0.11.3" />
@eerhardt (Member) commented Sep 13, 2019

This line can actually be removed. Since this project references Microsoft.ML.Dnn, and Microsoft.ML.Dnn references TF.NET, we don't need to say that this project references TF.NET. That way one duplicated version number can be removed. #Resolved

@eerhardt eerhardt dismissed their stale review September 13, 2019 18:46

the major concerns I had are now addressed

@eerhardt (Member) commented Sep 13, 2019

You still need to update

<PackageReference Include="TensorFlow.NET" Version="0.10.10" />
#Resolved

@@ -18,6 +18,7 @@
using NumSharp;
using Tensorflow;
using static Microsoft.ML.Transforms.Dnn.DnnUtils;
@yaeldekel commented Sep 15, 2019

Microsoft.ML.Transforms.Dnn.DnnUtils

This should be ordered alphabetically. #Resolved

@ashbhandare (Contributor, Author) replied:

VS shows a warning if I reorder. I am guessing the 'using static' directives should appear at the end, and they are already ordered alphabetically.

In reply to: 324450008
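If that guess is right, the resulting ordering is simply the plain using directives alphabetically, followed by the 'using static' directives alphabetically (these lines mirror the diff hunk above):

    using NumSharp;
    using Tensorflow;
    using static Microsoft.ML.Transforms.Dnn.DnnUtils;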

var originalShapeNdim = originalShape.ndim;
var l = new int[originalShapeNdim];
for (int ishape = 0; ishape < originalShapeNdim; ishape++)
l[ishape] = originalShapeDims[ishape] == -1 ? (int)d : originalShapeDims[ishape];
@yaeldekel commented Sep 15, 2019

(int)d

Should this cast also be done outside the loop? #Resolved

@ashbhandare (Contributor, Author) replied:

This is not an addition in this PR. If required, I can make this change.

In reply to: 324450404
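If that change is made, it would be along these lines (a sketch against the loop quoted above; not a verified compile):

    var castD = (int)d;                               // cast once, outside the loop
    var originalShapeNdim = originalShape.ndim;
    var l = new int[originalShapeNdim];
    for (int ishape = 0; ishape < originalShapeNdim; ishape++)
        l[ishape] = originalShapeDims[ishape] == -1 ? castD : originalShapeDims[ishape];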

@@ -912,6 +915,8 @@ private void UpdateCacheIfNeeded(long position, ITensorValueGetter[] srcTensorGe
{
if (outputCache.Position != position)
{
if (_parent.Graph.graph_key != tf.get_default_graph().graph_key)
_parent._session.graph.as_default();
@yaeldekel commented Sep 15, 2019

What do these lines do? #Resolved

@ashbhandare (Contributor, Author) replied:

If the current graph is not the TF default graph, this line sets it as the default graph.

In reply to: 324450509

});
var graph = new Graph();
graph = graph.as_default();
tf.train.import_meta_graph(path);
@yaeldekel commented Sep 15, 2019

tf.train.import_meta_graph(path);

It seems like there is no connection between graph and tf; does this line do anything to graph? #Resolved

@ashbhandare (Contributor, Author) replied:

This creates a new graph, and 'as_default()' sets it as the tf default graph (https://github.com/SciSharp/TensorFlow.NET/blob/41432600c8263cf972b13930c31479ffb412fa65/src/TensorFlowNET.Core/Graphs/Graph.cs#L134). 'graph' is where the meta graph is loaded and returned.

In reply to: 324450631
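Put together, the load pattern under discussion reads like this (the same lines as the diff above, annotated; the behavior of as_default/import_meta_graph is as described in the reply, not independently verified):

    var graph = new Graph();
    graph = graph.as_default();            // registers 'graph' as TF.NET's current default graph
    tf.train.import_meta_graph(path);      // imports the meta graph into the current default graph, i.e. into 'graph'
    return graph;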

@@ -170,7 +170,7 @@ private void CheckTrainingParameters(ImageClassificationEstimator.Options option
Host.CheckNonWhiteSpace(options.LabelColumn, nameof(options.LabelColumn));
Host.CheckNonWhiteSpace(options.TensorFlowLabel, nameof(options.TensorFlowLabel));

if (_session.graph.OperationByName(options.TensorFlowLabel) == null)
if (_session.graph.OperationByName(_labelTensor.name.Split(':')[0]) == null)
@yaeldekel commented Sep 15, 2019

_labelTensor.name.Split(':')[0]

If this is the name of the operation you are searching for in the graph, then the error message should mention the correct name. In this case, the options argument is not needed here. #Resolved

@ashbhandare (Contributor, Author) replied:

The options argument is not used anymore. _labelTensor has the correct name that the error message would need.

In reply to: 324450724

tf_with(evalGraph.as_default(), graph =>
{
var (_, _, groundTruthInput, finalTensor) = AddFinalRetrainOps(classCount, options.LabelColumn,
evalGraph.as_default();
@yaeldekel commented Sep 15, 2019

evalGraph.as_default();

Does this mutate evalGraph? In DnnUtils.cs line 96 you wrote graph = graph.as_default();. #Resolved

@yaeldekel replied:

In line 842 in this file there is also a call to as_default() without assignment.

In reply to: 324450796

@ashbhandare (Contributor, Author) replied:

This sets the tensorflow default graph to the evalGraph. This does not mutate the evalGraph.

In reply to: 324450796

@@ -235,7 +236,8 @@ private void CheckTrainingParameters(DnnRetrainEstimator.Options options)
inputTensor.InputType(index);
var tfInputShape = ((Tensor)inputTensor).TensorShape;

if (isInputVector && (tfInputShape == null || (tfInputShape.NDim == 0)))
var numInputDims = tfInputShape.ndim;
@codemzs (Member) commented Sep 16, 2019

var numInputDims = tfInputShape.ndim;

It seems tfInputShape can be null. We should set numInputDims to -1 if tfInputShape is null; otherwise assign numInputDims = tfInputShape.ndim. #Closed
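A compact way to express that guard (a sketch; identifiers follow the diff above):

    var numInputDims = tfInputShape != null ? tfInputShape.ndim : -1;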

@@ -843,6 +839,7 @@ public void UpdateCacheIfNeeded()
protected override Delegate MakeGetter(DataViewRow input, int iinfo, Func<int, bool> activeOutput, out Action disposer)
{
disposer = null;
_parent._session.graph.as_default();
@codemzs (Member) commented Sep 16, 2019

_parent._session.graph.as_default();

not needed. #Resolved

@ashbhandare (Contributor, Author) replied:

Required for Microsoft.ML.Scenarios.TensorFlowScenariosTests.TensorFlowImageClassification.

In reply to: 324890773

@@ -580,7 +585,7 @@ protected override Delegate MakeGetter(DataViewRow input, int iinfo, Func<int, b
return Utils.MarshalInvoke(MakeGetter<int>, type, input, iinfo, srcTensorGetters, activeOutputColNames, outputCache);
}

private Delegate MakeGetter<T>(DataViewRow input, int iinfo, ITensorValueGetter[] srcTensorGetters, string[] activeOutputColNames, OutputCache outputCache)
private Delegate MakeGetter<T>(DataViewRow input, int iinfo, ITensorValueGetter[] srcTensorGetters, string[] activeOutputColNames, OutputCache outputCache) where T:unmanaged
{
@codemzs (Member) commented Sep 16, 2019

Refactor code for unmanaged as done in ImageClassificationTransform. #Closed

@ashbhandare (Contributor, Author) commented Sep 16, 2019

I did the refactor, and it looks like one of the tests was hitting this code path for string: the test failed due to an error in the refactor. I corrected it and it passes now, but that implies the original MakeGetter was not erroring out for the string type even with 'where T: unmanaged' specified. Do we still want to refactor the code? The test I am referring to is Microsoft.ML.Scenarios.TensorFlowScenariosTests.TensorFlowStringTest.

In reply to: 324894953

@@ -843,6 +839,7 @@ public void UpdateCacheIfNeeded()
protected override Delegate MakeGetter(DataViewRow input, int iinfo, Func<int, bool> activeOutput, out Action disposer)
{
disposer = null;
_parent._session.graph.as_default();
@codemzs (Member) commented Sep 17, 2019

_parent._session.graph.as_default();

is this needed? #Resolved

@ashbhandare (Contributor, Author) replied:

Yes, the test Microsoft.ML.Scenarios.TensorFlowScenariosTests.TensorFlowImageClassification fails without this.

In reply to: 324938993

@codemzs (Member) left a comment

:shipit:

@ashbhandare ashbhandare merged commit 862ae84 into dotnet:master Sep 17, 2019
@ghost ghost locked as resolved and limited conversation to collaborators Mar 20, 2022

Successfully merging this pull request may close these issues.

Need to move from TF.NET version 0.10.10 to 0.11.3
7 participants