
protobuf-net-data's People

Contributors

digitaldan1, dotarj, gitter-badger, miogreen, rdingwall, robertpi, thomasjoscht, tw-aleksandergrechishkin


protobuf-net-data's Issues

How can I stop/cancel my server work?

Thank you.
This library has been a great help in my project, but I have found a small problem.

  1. Environment: 500,000 rows of data + WCF/IIS/WAS/TCP + protobuf-net + protobuf-net-data.
  2. WCF server: ExecuteReader() is called for a query returning the 500,000 rows.
  3. WCF client: reads 1,000 rows via reader.Read(), then calls reader.Close()/reader.Dispose() or stream.Close()/stream.Dispose().
  4. However, the server-side reader.Read() is still working through the remaining rows.
  5. The WCF client can only wait for it to finish.

How can I cancel the server-side work?

Examples of streaming across the network?

Do you have any examples of streaming a large IDataReader across the network? .NET Core has moved away from WCF. Is there a way for this to work with gRPC?

Thanks.

how to return a stream, wrapped in a message contract in WCF transferMode="Streamed" and ProtoDataReader

Hi,
I have been working on a WCF project.
Server: .NET Framework 4.0, IIS 7.5, WAS, netTcpBinding, transferMode=Streamed
Client: WinForms on .NET 4.0

I had to use ProtoDataReader, and it returns large amounts of data quickly. I am very happy with it.

However, a raw DataReader stream carries no additional information. I want to know how to return the stream wrapped in a message contract, together with extra fields:

// additional information + DataReader stream?
public class ResultReturn
{
    public string ServiceInfo { get; set; }
    public string OrderID { get; set; }
    public string OrderDetail { get; set; }

    // ... DataReader stream? ...
}
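One way to carry those extra fields is a WCF message contract that places them in message headers and leaves the stream as the sole body member (a sketch, assuming System.ServiceModel is referenced; WCF's streamed transfer mode requires the response body to be exactly one Stream member):

```csharp
using System.IO;
using System.ServiceModel;

[MessageContract]
public class ResultReturn
{
    // Extra information travels in the message headers...
    [MessageHeader]
    public string ServiceInfo { get; set; }

    [MessageHeader]
    public string OrderID { get; set; }

    [MessageHeader]
    public string OrderDetail { get; set; }

    // ...while the body is the raw DataReader stream.
    [MessageBodyMember]
    public Stream DataReaderStream { get; set; }
}
```

The service operation would then return ResultReturn instead of a bare Stream; WCF reads the headers eagerly and leaves the body to be streamed.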

Could you also show how the client and server operations work together (complete code)?

thanks.

Unless computed columns are included, serialization does not work in Mono

ProtoDataColumnFactory.GetColumns ignores schema rows whose Expression is not DBNull unless ProtoDataWriterOptions.IncludeComputedColumns is enabled. On Mono, DataTableReader.GetSchemaTable always includes an Expression column whose value is string.Empty, so the check needs to treat an empty string the same as DBNull.
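A sketch of the proposed check, using only System.Data (the real fix belongs in ProtoDataColumnFactory.GetColumns):

```csharp
using System;
using System.Data;

// Treat a schema row as describing a computed column only when an
// Expression column exists AND holds a non-empty value. Mono's
// DataTableReader.GetSchemaTable reports Expression = "" for ordinary
// columns, so "" must be treated the same as DBNull.
static bool IsComputed(DataRow schemaRow)
{
    if (!schemaRow.Table.Columns.Contains("Expression"))
        return false;
    object expression = schemaRow["Expression"];
    return expression != DBNull.Value
        && !string.IsNullOrEmpty(expression as string);
}
```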

ArgumentException in ProtoDataWriter

Hi Richard,

I started to try out protobuf-net and this extension, but I haven't had much luck (with the latest version as of today): when I try to serialize an SqlDataReader to a MemoryStream, I get an ArgumentException which says "Column 'Expression' does not belong to table SchemaTable."

I think it's related to the fix for issue #11, because it happens in that code in the ProtoDataWriter class's Serialize method. The problem is that the code tries to test whether the schema table's "Expression" column is DBNull or not, but there is no such column in the reader's schema table, so it causes an exception. The reader is a normal SqlDataReader obtained from a SqlCommand's ExecuteReader() method.

Thanks,
Laszlo
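A defensive version of the check would guard on the column's presence first (a standalone sketch using only System.Data; the actual fix would live in ProtoDataWriter.Serialize):

```csharp
using System;
using System.Data;

// Only inspect the "Expression" column if the provider's schema table
// actually has one. SqlDataReader.GetSchemaTable does not include it, so
// indexing it directly throws ArgumentException ("Column 'Expression'
// does not belong to table SchemaTable").
static bool[] FindComputedColumns(DataTable schemaTable)
{
    bool hasExpression = schemaTable.Columns.Contains("Expression");
    var computed = new bool[schemaTable.Rows.Count];
    for (int i = 0; i < schemaTable.Rows.Count; i++)
    {
        computed[i] = hasExpression
            && schemaTable.Rows[i]["Expression"] != DBNull.Value;
    }
    return computed;
}
```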

Table Names

It appears that the table names are not being serialized. When sending a large dataset, it's nice to have that additional piece of data.

"The column mapping from SourceColumn 'FullName' failed because the DataColumn 'FullName' is a computed column."

Reported via email from Daniel-Lucian Corsei:

Hello,

I think I found an issue in the protobuf-net-data project. I tried to sign up on GitHub to add this issue, but I received an error, so I decided to send you an email.

I have attached a sample project where you can see the error I am talking about. The problem is that serializing a DataSet does not skip the computed columns, so when I try to load the DataSet back it gives me the error "The column mapping from SourceColumn 'FullName' failed because the DataColumn 'FullName' is a computed column."
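Until the writer skips computed columns by default, one workaround on the receiving side is to clear the column expressions before calling DataTable.Load, which might avoid the mapping error (a sketch using only System.Data; ClearExpressions is a hypothetical helper, and the expressions would be re-applied afterwards if the computed values are needed):

```csharp
using System;
using System.Data;

// Remove column expressions so DataTable.Load can map every source column
// without hitting the "computed column" mapping error.
static void ClearExpressions(DataTable table)
{
    foreach (DataColumn column in table.Columns)
    {
        if (!string.IsNullOrEmpty(column.Expression))
            column.Expression = string.Empty;
    }
}
```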

Possible bug in RecordReader

Hi, we have upgraded this library in our project from 2.1.1 to 3.0.0.

We have a PostgreSQL database that can be exported to an XmlDataset, which is then serialized to a file with protobuf-net 2.4 and this project. The XmlDataset contains multiple DataTables.

Serialization works fine: for the DataTable in question, all 15 columns are saved. This DataTable is not processed first; it is preceded by a table with 10 columns.

On the next start of the application the serialized file is loaded and deserialized. Here we get an IndexOutOfRangeException in ProtoDataReader#GetValues(object[] values) at line 205. The reason is that values has size 15, which is correct because the table should have 15 columns, so valuesCount is 15 too. BUT context.Buffers has size only 10.

So I tried to debug and find where the problem is. I didn't fully trace the call tree, but what I found is that in RecordReader#ReadRecord(ProtoReaderContext context) context.Buffers is created/initialized, and if it isn't null it is only cleared.

The first DataTable that is deserialized has 10 columns. Is it possible that when the next DataTable is deserialized, context.Buffers isn't reset to null, so the buffers are only cleared, leading to this error? During ReadRecordValues, columns 11-15 hit the if at line 52 and their values are skipped.

This is how the file is loaded and created:

    public static XMLDataSet LoadXML()
    {
        database = new XMLDataSet();
        using (var reader = DataSerializer.Deserialize(new GZipStream(new FileStream(soubor, FileMode.Open), CompressionMode.Decompress)))
        {
            var tables = new List<DataTable>();
            foreach (DataTable table in database.Tables)
            {
                tables.Add(table);
            }
            database.Load(reader, LoadOption.OverwriteChanges, tables.ToArray());
        }
        return database;
    }

    public static void SaveXML(XMLDataSet database)
    {
        using (GZipStream gzip = new GZipStream(new FileStream(soubor, FileMode.Create), CompressionMode.Compress))
        {
            database.SchemaSerializationMode = SchemaSerializationMode.ExcludeSchema;
            database.RemotingFormat = SerializationFormat.Binary;
            DataSerializer.Serialize(gzip, database);
            gzip.Flush();
        }
    }

Incompatibility with protobuf-net v3

Hi,

We have recently updated to protobuf-net v3.0.29 and now get the following exception

"System.MissingMethodException: Method not found: 'Void ProtoBuf.ProtoWriter..ctor(System.IO.Stream, ProtoBuf.Meta.TypeModel, ProtoBuf.SerializationContext)'.

Server stack trace: 
   at ProtoBuf.Data.ProtoDataWriter.Serialize(Stream stream, IDataReader reader, ProtoDataWriterOptions options)
   at ProtoBuf.Data.DataSerializerEngine.Serialize(Stream stream, DataSet dataSet, ProtoDataWriterOptions options)
   at Brady.Etrm.Base.Infrastructure.Remoting.DataSetBinaryRemotingFormatSerializationSurrogate.GetObjectData(Object obj, SerializationInfo info, StreamingContext context) in C:\Development\etrm\Source\ECS\Libraries\Viz.Base\Infrastructure\Remoting\DataSetBinaryRemotingFormatSerializationSurrogate.cs:line 28
   at Brady.Etrm.Base.Infrastructure.Remoting.RootObjectProtobufSerializationWrapper.GetObjectData(SerializationInfo info, StreamingContext context) in C:\Development\etrm\Source\ECS\Libraries\Viz.Base\Infrastructure\Remoting\RootObjectProtobufSerializationWrapper.cs:line 34
   at System.Runtime.Serialization.Formatters.Binary.WriteObjectInfo.InitSerialize(Object obj, ISurrogateSelector surrogateSelector, StreamingContext context, SerObjectInfoInit serObjectInfoInit, IFormatterConverter converter, ObjectWriter objectWriter, SerializationBinder binder)
   at System.Runtime.Serialization.Formatters.Binary.WriteObjectInfo.Serialize(Object obj, ISurrogateSelector surrogateSelector, StreamingContext context, SerObjectInfoInit serObjectInfoInit, IFormatterConverter converter, ObjectWriter objectWriter, SerializationBinder binder)
   at System.Runtime.Serialization.Formatters.Binary.ObjectWriter.Serialize(Object graph, Header[] inHeaders, __BinaryWriter serWriter, Boolean fCheck)
   at System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Serialize(Stream serializationStream, Object graph, Header[] headers, Boolean fCheck)
   at System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Serialize(Stream serializationStream, Object graph, Header[] headers)
   at Brady.Etrm.Infrastructure.Remoting.BinaryWithProtobufSerializationHelper.SerializeBinaryMessage(IMessage msg, Stream outputStream) in C:\Development\etrm\Source\ECS\Libraries\Viz.Base\Infrastructure\Remoting\BinaryWithProtobufSerializationHelper.cs:line 63
   at Brady.Etrm.Infrastructure.Remoting.BradyServerFormatterSink.SerializeResponse(IServerResponseChannelSinkStack sinkStack, IMessage msg, ITransportHeaders& headers, Stream& stream) in C:\Development\etrm\Source\ECS\Libraries\Viz.Base\Infrastructure\Remoting\BradyServerFormatterSink.cs:line 170
   at Brady.Etrm.Infrastructure.Remoting.BradyServerFormatterSink.ProcessMessage(IServerChannelSinkStack sinkStack, IMessage requestMsg, ITransportHeaders requestHeaders, Stream requestStream, IMessage& responseMsg, ITransportHeaders& responseHeaders, Stream& responseStream) in C:\Development\etrm\Source\ECS\Libraries\Viz.Base\Infrastructure\Remoting\BradyServerFormatterSink.cs:line 95

Exception rethrown at [0]:
   at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
   at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
   at Viz.Middleware.Extractor.VizPeer.IExtractorDataSetService.GetUserPermissionList(String user, String database)
   at Viz.EamSuite.DealManager.Database.DatabaseAccessPoint.GetUserPermissions() in C:\Development\etrm\Source\ECS\EFM\Viz.Efm.Presentation.Model\ViewModels\DealManager\DealWindow\Database\DatabaseAccessPoint.cs:line 854
   at Viz.EamSuite.DealManager.ViewPortfolioMonitor.RefreshPermissionList() in C:\Development\etrm\Source\ECS\EFM\Viz.Efm.Presentation.Windows\Views\DealManager\ContractManagement\Views\ViewPortfolioMonitor.cs:line 7840"

I see from protobuf v3 docs that:

ProtoReader/ProtoWriter must now be instantiated via ProtoReader.Create, not new (recent v2 releases have been advising this change for some time); most users will not be using these APIs.

Are there any plans to support v3? Alternatively, could a new version be published that caps the dependency at the latest v2, so that our NuGet references resolve protobuf-net v2 only?

Thanks
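In the meantime, a consuming project can cap its own protobuf-net reference to the v2 line with a NuGet version range (a sketch; the lower bound shown is illustrative, adjust it to your setup):

```xml
<ItemGroup>
  <!-- Stay on protobuf-net v2: >= 2.4.0 and < 3.0 -->
  <PackageReference Include="protobuf-net" Version="[2.4.0,3.0)" />
</ItemGroup>
```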

Transform DataReader into another DataReader but retain streaming...

Apologies for the baffling title; it's not really an issue either, so I'd better explain:

We are using your fantastic changes to stream data from a reader over WCF into a reader on the client side - works really well. Thanks.

I have a paradigm that I think could borrow from your work however.

We are using 'flat' table storage: we store one row for each cell of a table's data, so a table of 10 columns by 5 rows is actually stored as 50 rows, each row describing the row, column, data type and value.

A consumer will request, for example, 100 rows of actual data, which equates to 1,000 rows in storage; however, I would like to present this to the consumer as the original 100 rows x 10 columns they are expecting. That requires me to return a different reader from the one I use to query the database/service.

As your deserialization populates a data reader, I wondered whether you could point me in the right direction for populating a reader at run time from a secondary object (in your case a stream, in mine another reader).

Thanks Richard.

Alex
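If a page of data fits in memory, one way to obtain a second reader is to pivot the flat rows into a DataTable and return its built-in reader (a sketch; the column names "row", "col" and "value" are assumptions about the flat-storage schema, and a fully streaming version would implement IDataReader directly instead):

```csharp
using System;
using System.Data;

// Pivot a flat (row, col, value) reader into a rectangular table and
// return a reader over the result.
static IDataReader Pivot(IDataReader flat, int columnCount)
{
    var table = new DataTable();
    for (int i = 0; i < columnCount; i++)
        table.Columns.Add("column" + i, typeof(object));

    int rowOrdinal = flat.GetOrdinal("row");
    int colOrdinal = flat.GetOrdinal("col");
    int valueOrdinal = flat.GetOrdinal("value");

    DataRow current = null;
    int currentRow = int.MinValue;
    while (flat.Read())
    {
        int row = flat.GetInt32(rowOrdinal);
        if (row != currentRow)
        {
            // Start a new output row whenever the source row index changes.
            current = table.NewRow();
            table.Rows.Add(current);
            currentRow = row;
        }
        current[flat.GetInt32(colOrdinal)] = flat.GetValue(valueOrdinal);
    }
    return table.CreateDataReader();
}
```

The returned DataTableReader can then be fed straight into DataSerializer.Serialize on the service side.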

Port .net core?

Any plans on porting this to .NET Core? I'm happy to contribute if it's on the roadmap.

Deserialization error "Arithmetic operation resulted in an overflow"

Hi, I'm getting this error when doing this:

    DataTable dt = new DataTable();
    using (Stream ms = ToStream(theString))
    using (IDataReader reader = DataSerializer.Deserialize(ms))
    {
        dt.Load(reader);
    }

    static Stream ToStream(string str)
    {
        MemoryStream stream = new MemoryStream();
        StreamWriter writer = new StreamWriter(stream);
        writer.Write(str);
        writer.Flush();
        stream.Position = 0;
        return stream;
    }

System.OverflowException: Arithmetic operation resulted in an overflow.
at Int32 ProtoBuf.ProtoReader.TryReadUInt32VariantWithoutMoving(System.Boolean trimNegative, System.UInt32 value) in c:\Dev\protobuf-net\protobuf-net\ProtoReader.cs:line 104
at UInt32 ProtoBuf.ProtoReader.ReadUInt32Variant(System.Boolean trimNegative) in c:\Dev\protobuf-net\protobuf-net\ProtoReader.cs:line 151
at Int32 ProtoBuf.ProtoReader.ReadInt32() in c:\Dev\protobuf-net\protobuf-net\ProtoReader.cs:line 277
at Object ProtoBuf.Data.ProtoDataReader.b__0() in c:\TeamCity\buildAgent\work\859c78e227c61cb9\src\ProtoBuf.Data\ProtoDataReader.cs:line 422
at System.Void ProtoBuf.Data.ProtoDataReader.ReadCurrentRow() in c:\TeamCity\buildAgent\work\859c78e227c61cb9\src\ProtoBuf.Data\ProtoDataReader.cs:line 495
at Boolean ProtoBuf.Data.ProtoDataReader.Read() in c:\TeamCity\buildAgent\work\859c78e227c61cb9\src\ProtoBuf.Data\ProtoDataReader.cs:line 281
at Int32 System.Data.Common.DataAdapter.FillLoadDataRow(System.Data.ProviderBase.SchemaMapping mapping)
at Int32 System.Data.Common.DataAdapter.FillFromReader(System.Data.DataSet dataset, System.Data.DataTable datatable, System.String srcTable, System.Data.ProviderBase.DataReaderContainer dataReader, System.Int32 startRecord, System.Int32 maxRecords, System.Data.DataColumn parentChapterColumn, System.Object parentChapterValue)
at Int32 System.Data.Common.DataAdapter.Fill(System.Data.DataTable[] dataTables, System.Data.IDataReader dataReader, System.Int32 startRecord, System.Int32 maxRecords)
at Int32 System.Data.Common.LoadAdapter.FillFromReader(System.Data.DataTable[] dataTables, System.Data.IDataReader dataReader, System.Int32 startRecord, System.Int32 maxRecords)
at System.Void System.Data.DataTable.Load(System.Data.IDataReader reader, System.Data.LoadOption loadOption, System.Data.FillErrorEventHandler errorHandler)
at System.Void System.Data.DataTable.Load(System.Data.IDataReader reader)
at System.Void Master.SqsListener.WriteRecTable(System.String msg) in l:\Projects\Master\Master\SqsListener.cs:line 243
at System.Void Master.SqsListener.processMsg(System.Int32 i, Amazon.SQS.Model.Message msg) in l:\Projects\Master\Master\SqsListener.cs:line 192
at System.Void Master.SqsListener.RunListenDb() in l:\Projects\Master\Master\SqsListener.cs:line 117
at static System.Void System.Threading.ThreadHelper.ThreadStart_Context(System.Object state)
at static System.Void System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext executionContext, System.Threading.ContextCallback callback, System.Object state, System.Boolean preserveSyncCtx)
at static System.Void System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext executionContext, System.Threading.ContextCallback callback, System.Object state, System.Boolean preserveSyncCtx)
at static System.Void System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext executionContext, System.Threading.ContextCallback callback, System.Object state)
at System.Void System.Threading.ThreadHelper.ThreadStart()

any idea what the cause could be or where to start looking?
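One likely culprit, judging from the snippet, is that the binary protobuf payload was at some point decoded into a string and is being re-encoded through a StreamWriter, which corrupts non-text bytes; that typically surfaces as an OverflowException in the varint reader. If the payload has to travel as a string (for example in an SQS message body), Base64 keeps the bytes intact (a sketch using only the BCL):

```csharp
using System;
using System.IO;

// Base64 round-trips arbitrary bytes through a string without loss;
// text encodings do not.
static string ToText(byte[] payload) => Convert.ToBase64String(payload);

static Stream FromText(string text) =>
    new MemoryStream(Convert.FromBase64String(text));
```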

Serialize DataSet

Hello, I can see how to deserialize a DataSet, but I cannot find how to serialize one.

Greetings and good work.

Issue serializing DataTable with column of type DateTimeOffset

I get an error that DateTimeOffset is not supported. I note from protobuf-net that there is a similar problem which can be worked around with some custom code. Is there a similar way to achieve this with protobuf-net-data?

My code is something like:

    using (var stream = new MemoryStream())
    {
        DataSerializer.Serialize(stream, myDataTable);
        stream.Seek(0, SeekOrigin.Begin);
        compressObj = stream.ToArray();
    }

where myDataTable contains a column of type DateTimeOffset. I guess I could manually traverse my DataTable and convert each cell to a DateTime, but this would be pretty inefficient. Thanks
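Pending built-in DateTimeOffset support (see the separate feature request), one workaround is to split the column into two supported columns before serializing, so no information is lost (a sketch using only System.Data; the helper name is illustrative, and the mapping would be reversed after deserializing):

```csharp
using System;
using System.Data;

// Replace a DateTimeOffset column with a supported pair of columns:
// the UTC instant plus the offset in minutes.
static void SplitDateTimeOffsetColumn(DataTable table, string name)
{
    table.Columns.Add(name + "_utc", typeof(DateTime));
    table.Columns.Add(name + "_offsetMinutes", typeof(int));
    foreach (DataRow row in table.Rows)
    {
        if (row[name] is DateTimeOffset dto)
        {
            row[name + "_utc"] = dto.UtcDateTime;
            row[name + "_offsetMinutes"] = (int)dto.Offset.TotalMinutes;
        }
    }
    table.Columns.Remove(name);
}
```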

DeserializeDataSet with multiple tables

I'm hoping to use protobuf-net-data to replace the use of BinaryFormatter in a generic SQL caching solution. When retrieving from cache we don't know the number of tables in the DataSet. Could protobuf-net-data be extended to support this? It already supports an arbitrary number of columns and rows.

In the nunit test below, if I pass the right number of tables to the DeserializeDataSet call it will work. I would like to not specify any tables at all.

        [Test]
        public void MultipleDataSetTest()
        {
            // GIVEN: a DataSet with multiple tables
            DataSet expected = new DataSet();
            DataTable dt = expected.Tables.Add("Table1");
            dt.Columns.Add(new DataColumn("column1", typeof(int)));
            dt.Columns.Add(new DataColumn("column2", typeof(double)));
            dt.Rows.Add(1234, 100.10);

            DataTable dt2 = expected.Tables.Add("Table2");
            dt2.Columns.Add(new DataColumn("column1", typeof(string)));
            dt2.Columns.Add(new DataColumn("column2", typeof(int)));
            dt2.Rows.Add("serialization", 100);

            // WHEN: we serialize into cache
            var stream = new MemoryStream();
            DataSerializer.Serialize(stream, expected);
            stream.Position = 0;

            // THEN: we can deserialize the object with the same number of tables
            var actual = DataSerializer.DeserializeDataSet(stream, new List<string> { "Table1" });
            Assert.AreEqual(expected.Tables.Count, actual.Tables.Count);
            for (var table = 0; table < expected.Tables.Count; table++)
            {
                Assert.AreEqual(expected.Tables[table].Rows.Count, actual.Tables[table].Rows.Count);
                Assert.AreEqual(expected.Tables[table].Columns.Count, actual.Tables[table].Columns.Count);
            }
        }

Thanks

Fill the ringbuffer as data is being transmitted

In ProtoDataStream.cs, the Read method calls FillBuffer, which replenishes the circular buffer synchronously. In a WCF scenario with hundreds of concurrent users, I'd like the circular buffer to be replenished while transmission occurs. That's way beyond my game, really. I can imagine that a Task launched in the constructor could monitor the circular buffer and keep it close to capacity. What do you think? Or something along those lines: https://code.msdn.microsoft.com/Custom-WCF-Streaming-436861e6

InvalidCastException When Serializing Char Type

Hi Richard,

I think I've found another issue while integration-testing the extension: in the ProtoDataWriter.Serialize method, when I try to serialize a column whose type is char, it causes an InvalidCastException.

I investigated the issue a little and found that the problem is that the ProtoWriter tries to serialize the boxed char value as an Int16 value. But according to this blog entry (http://blogs.msdn.com/b/ericlippert/archive/2009/03/19/representation-and-identity.aspx) a boxed value can be unboxed only as the same type, which is in contrast to what happens here: with the explicit cast to Int16 we tell the runtime to unbox the object as an Int16 despite the fact that it is a char. A possible solution would be a double cast (according to this thread: http://social.msdn.microsoft.com/Forums/sk/csharpgeneral/thread/3748ddb4-0adc-4cad-879c-b165b2341c8a): (Int16)(char)value, or simply the Convert.ToInt16() method (although that could be slower, I think).

Thanks,
Laszlo
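The unboxing behaviour is easy to demonstrate in isolation (a standalone sketch; the actual fix would go in ProtoDataWriter):

```csharp
using System;

object boxed = 'A';                          // a boxed System.Char

// short direct = (short)boxed;              // InvalidCastException: a boxed
                                             // value unboxes only to its
                                             // exact type

short viaDoubleCast = (short)(char)boxed;    // unbox as char, then convert
short viaConvert = Convert.ToInt16(boxed);   // also works, a little slower

Console.WriteLine(viaDoubleCast);            // 65
Console.WriteLine(viaConvert);               // 65
```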

Support data type 'System.DateTimeOffset'

Can you support data type System.DateTimeOffset? I get this error:

    Cannot serialize data column of type 'System.DateTimeOffset'.
    Only the following column types are supported:
    Boolean, Byte, Byte[], Char, Char[], DateTime, Decimal, Double, Guid, Int16, Int32, Int64, Single, String, TimeSpan.

Serialize/Deserialize Zero-Length Array as Is

Hi,

At our company we are thinking about using protobuf-net with this extension to serialize/deserialize IDataReader from/to NHibernate. This could be connected to an AppFabric Cache solution in our internal framework. Because it's a framework which will be used by others, we don't currently know all of the possible demands, so we have to support zero-length arrays as well. But the extension does not serialize those, so deserialization passes back nulls (as you noted in the readme file), which means the data before and after serialization won't be the same.

Unfortunately I don't know the reasons behind this decision, but it would be a big help to us if this extension supported zero-length arrays as well.

Thanks,
Laszlo

More test cases

Need some more tests with different combinations of data types/nullness/nesting etc.

We need Metadata also.

Metadata about unique keys, auto-increment, default values, base table name, data provider, data relations etc. is ignored.

Is it serialised entirely into memory?

Hi, thanks first for producing this groovy component. We are using it to serialise and deserialise data readers either side of a WCF service. I have a question though (apologies if it's a bit stupid) - in order to allow the caller of the service to use the resultant stream, I have to set the position back to 0:

    /// <summary>
    /// TestDataService implements the WCF service interface.
    /// </summary>
    [ServiceBehavior(Namespace = "http://www.myservice.com/20121128/1/TestDataService")]
    public class TestDataService : IReportingDataService
    {
        /// <summary>
        /// Retrieves the data for a specified report and its options.
        /// </summary>
        /// <param name="request">A GetReportDataRequest object.</param>
        /// <returns>A GetReportDataResponse object.</returns>
        public GetReportDataResponse GetReportData(GetReportDataRequest request)
        {
            MemoryStream stream = new MemoryStream();
            ProtoBuf.Data.DataSerializer.Serialize(stream, DataReaderHelper.GetDataReader(100, 10));
            stream.Seek(0, SeekOrigin.Begin);
            return new GetReportDataResponse()
            {
                DataReaderStream = stream
            };
        }
    }

This looks to me like the entire datareader is serialised into the memory stream - would that be right?

Thanks for all your hard work.
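Yes: DataSerializer.Serialize runs the whole reader into the MemoryStream before the response is returned. To avoid buffering, this library's ProtoDataStream wraps the reader and serializes rows on demand as WCF pulls from the stream (a sketch; it assumes the binding uses transferMode="Streamed"):

```csharp
// Rows are serialized incrementally as WCF reads the stream,
// instead of being buffered in a MemoryStream up front.
return new GetReportDataResponse()
{
    DataReaderStream = new ProtoBuf.Data.ProtoDataStream(
        DataReaderHelper.GetDataReader(100, 10))
};
```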

ProtoDataReader.GetOrdinal case-sensitivity bug

After upgrading to 3.0.0 we noticed that ProtoDataReader.GetOrdinal is now case-sensitive, where it used to be case-insensitive (in 2.1.1).

The documentation for the IDataRecord interface (https://docs.microsoft.com/en-us/dotnet/api/system.data.idatarecord.getordinal?view=netstandard-2.0#remarks) says:

GetOrdinal performs a case-sensitive lookup first. If it fails, a second case-insensitive search is made. GetOrdinal is kana-width insensitive.

I've written some tests for case-insensitivity, kana-type-insensitivity and character-width-sensitivity, and I've implemented a fix to make it work like before (and to make it consistent with the behaviour seen in the FieldNameLookup source in dotnet corefx). I'll raise a PR for these.
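The documented two-pass lookup can be sketched in isolation like this (the real fix mirrors corefx's FieldNameLookup; note that for the second pass the docs imply a culture-aware, kana-width-insensitive comparison via CompareInfo, so OrdinalIgnoreCase below is a simplification):

```csharp
using System;

// Two-pass ordinal lookup per the IDataRecord contract:
// exact (case-sensitive) match first, case-insensitive fallback second.
static int GetOrdinal(string[] columnNames, string name)
{
    for (int i = 0; i < columnNames.Length; i++)
        if (string.Equals(columnNames[i], name, StringComparison.Ordinal))
            return i;
    for (int i = 0; i < columnNames.Length; i++)
        if (string.Equals(columnNames[i], name, StringComparison.OrdinalIgnoreCase))
            return i;
    throw new IndexOutOfRangeException(name);
}
```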

How to call Connection.Close() with WCF transferMode="Streamed" and ProtoDataReader

Hi. (I do not speak English well; I hope you will understand. :) )
I have a WCF project.
Calling the WCF service should obtain 500,000 rows of data, as quickly as possible.

My WCF host service is:

    public Stream GetData()
    { // ODP.NET
        OracleCommand cm = null;
        string commandText = string.Empty;

        Stream drdStream = null;
        string connStr = ".....";

        try
        {
            // RowCount = 579,563
            //commandText = "SELECT * FROM ***** ";

            cn = new OracleConnection(connStr);
            cm = new OracleCommand(commandText, cn);

            cn.Open();
            OracleDataReader drd = cm.ExecuteReader();

            drdStream = new ProtoDataStream((IDataReader)drd);
        }
        catch (System.Exception ex)
        {
            Console.WriteLine("Exeption {0}", ex.Message);
        }
        finally
        {
            if (cm != null)
            {
                cm.Dispose();
                cm = null;
            }
            //if (cn != null)
            //{
            //    cn.Close();
            //    cn.Dispose();
            //    cn = null;
            //}
        }

        return drdStream;
    }

My client code is:

    // proxy is netTcpBinding, IIS 7.5 WAS hosting, transferMode="Streamed"

            ProtoDataReader pdr = null;
            try
            {
                Stream s = proxy.GetData();
                pdr = new ProtoDataReader(s);

                while (pdr.Read())
                {
                    Console.WriteLine(string.Format("'{0,-6}' '{1}' '{2}' '{3}' '{4}' '{5}'", iCnt++, pdr.GetValue(0), pdr.GetValue(1), pdr.GetValue(2), pdr.GetValue(3), pdr.GetValue(4)));
                }
            }
            catch (System.Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
            finally
            {

            }

It works very well.

Because the DataReader is passed to the client as a stream, the server saves resources, and the response to the query is fast.
Thank you.

But after using the DataReader, I could not find a way to close the connection. Under these conditions I need a way to close the DB connection.
Please help.
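One common pattern (a sketch, untested against ODP.NET specifically, and assuming ProtoDataStream disposes its reader once the stream is consumed) is to open the reader with CommandBehavior.CloseConnection, so the connection's lifetime is tied to the reader:

```csharp
// Tie the connection's lifetime to the reader: when the reader is
// disposed (after the client finishes consuming the stream), the
// underlying connection is closed with it.
OracleDataReader drd = cm.ExecuteReader(CommandBehavior.CloseConnection);
drdStream = new ProtoDataStream(drd);
// Do NOT close cn in finally; it must stay open while the stream is read.
```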

ProtoDataStream to work as cursor

We have the following scenario:

- Service and client are on the same box, using net.pipe binding to talk to each other
- the service reads a huge DataTable that could be ~4 GB or even more
- the consumer needs to receive this DataTable
- the communication happens via a WCF contract like Stream GetData()
- we are using ProtoBuf.Data.ProtoDataStream for table serialization/deserialization

The problem:

  • both client and server are on one box, so there are times when the RAM used by the DataTable is doubled.

Is there a way to read from the service and delete the already-read data on the service side?

DataTable as member of serialized class

Can I use this project to serialize something like this:

    [ProtoContract]
    class aClass
    {
        [ProtoMember(1)]
        public string aString;

        [ProtoMember(2)]
        public DataTable aTable;
    }

I've tried to do it but am getting a "No serializer defined for type: System.Data.DataTable" error.
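protobuf-net has no built-in DataTable serializer, so one workaround is to store the table's serialized bytes as the member and rebuild the table on demand with this library's DataSerializer (a sketch; the member and method names are illustrative):

```csharp
using System.Data;
using System.IO;
using ProtoBuf;

[ProtoContract]
class aClass
{
    [ProtoMember(1)]
    public string aString;

    // Serialized form of the table; protobuf-net handles byte[] natively.
    [ProtoMember(2)]
    public byte[] aTableBytes;

    public DataTable GetTable()
    {
        var table = new DataTable();
        using (var ms = new MemoryStream(aTableBytes))
        using (var reader = ProtoBuf.Data.DataSerializer.Deserialize(ms))
            table.Load(reader);
        return table;
    }

    public void SetTable(DataTable table)
    {
        using (var ms = new MemoryStream())
        {
            ProtoBuf.Data.DataSerializer.Serialize(ms, table);
            aTableBytes = ms.ToArray();
        }
    }
}
```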
