Forum Replies Created
kcks66
Participant
Hi Mikhail,
I checked, and none of the Calculated Channels data was uploaded to the Target PC.
Below is the data from the DAT files before upload:
Below is the data from PostgreSQL after upload:
Is this normal?
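For reference, a quick way to check which calculated channels actually have rows after the upload might be a query along these lines (only a sketch; the column names cnl_num and time_stamp and the channel number range are my assumptions about the ModArcPostgreSql table, not its actual definition):
-- Sketch only: column names (cnl_num, time_stamp) and the channel range
-- are assumptions; adjust them to the actual min_historical definition.
SELECT cnl_num,
       COUNT(*)        AS row_count,
       MIN(time_stamp) AS first_row,
       MAX(time_stamp) AS last_row
FROM mod_arc_postgre_sql.min_historical
WHERE cnl_num BETWEEN 601 AND 610   -- example range of calculated channel IDs
GROUP BY cnl_num
ORDER BY cnl_num;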
Thank you.
kcks66
Participant
Hi Mikhail,
I found many errors similar to this one:
2024-04-18 16:28:18 Error transferring historical data:
System.IO.IOException: Unable to read data from the transport connection: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond..
---> System.Net.Sockets.SocketException (10060): A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.
at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 count)
--- End of inner exception stack trace ---
at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 count)
at Scada.Client.ClientBase.ReceiveResponse(DataPacket request)
at Scada.Client.ScadaClient.WriteChannelData(Int32 archiveMask, ICollection`1 slices, WriteDataFlags flags)
at e.j()
kcks66
Participant
Hi Mikhail,
However, I found that some data was lost after uploading and converting it into the PostgreSQL database.
Below is the data from the DAT files before upload:
Below is the data from PostgreSQL after upload:
Please advise: did I miss any steps or get a setting wrong?
Thank you.
kcks66
Participant
Hi Mikhail,
I manually created the partition in pgAdmin as you advised. It worked.
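For reference, the partition I created was along these lines (only a sketch; the partition name, the time column and the monthly bounds are illustrative assumptions, not the exact definition used by the module):
-- Illustrative only: the exact partition key and bounds depend on how
-- mod_arc_postgre_sql.min_historical is actually partitioned.
CREATE TABLE mod_arc_postgre_sql.min_historical_2024_04
    PARTITION OF mod_arc_postgre_sql.min_historical
    FOR VALUES FROM ('2024-04-01') TO ('2024-05-01');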
It looks like my data set is huge; it takes many hours to transfer just a few days of data. I have data from 2020 to 2024 to transfer, so I wonder how long it will take to finish.
kcks66
Participant
Hi Mikhail,
That's great; I can now see the historical data upload status in the RapidGate log files. However, a problem occurs on the Target PC side:
Error message:
2024-04-17 17:48:23 Error writing to database:
Npgsql.PostgresException (0x80004005): 23514: no partition of relation "min_historical" found for row
DETAIL: Detail redacted as it may contain sensitive data. Specify 'Include Error Detail' in the connection string to include this information.
at Npgsql.Internal.NpgsqlConnector.<ReadMessage>g__ReadMessageLong|223_0(NpgsqlConnector connector, Boolean async, DataRowLoadingMode dataRowLoadingMode, Boolean readingNotifications, Boolean isReadingPrependedMessage)
at Npgsql.NpgsqlDataReader.NextResult(Boolean async, Boolean isConsuming, CancellationToken cancellationToken)
at Npgsql.NpgsqlCommand.ExecuteReader(CommandBehavior behavior, Boolean async, CancellationToken cancellationToken)
at Npgsql.NpgsqlCommand.ExecuteReader(CommandBehavior behavior, Boolean async, CancellationToken cancellationToken)
at Npgsql.NpgsqlCommand.ExecuteNonQuery(Boolean async, CancellationToken cancellationToken)
at Scada.Server.Modules.ModArcPostgreSql.Logic.PointQueue.ProcessItems()
Exception data:
Severity: ERROR
SqlState: 23514
MessageText: no partition of relation "min_historical" found for row
Detail: Detail redacted as it may contain sensitive data. Specify 'Include Error Detail' in the connection string to include this information.
SchemaName: mod_arc_postgre_sql
TableName: min_historical
File: execPartition.c
Line: 335
Routine: ExecFindPartition
Please advise, and thank you.
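For reference, I believe the existing partitions of min_historical and the ranges they cover can be listed from the standard PostgreSQL catalogs with something like this (a sketch I have not yet run on my system):
-- List the partitions of mod_arc_postgre_sql.min_historical and their bounds.
SELECT child.relname AS partition_name,
       pg_get_expr(child.relpartbound, child.oid) AS partition_bounds
FROM pg_inherits i
JOIN pg_class parent ON parent.oid = i.inhparent
JOIN pg_class child ON child.oid = i.inhrelid
JOIN pg_namespace ns ON ns.oid = parent.relnamespace
WHERE ns.nspname = 'mod_arc_postgre_sql'
  AND parent.relname = 'min_historical';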
kcks66
Participant
Hi Mikhail,
I set up another PC with the RapidGate module, and the Upload options appear on this PC.
Please advise how I should configure it to upload all the historical data in DAT format from this PC and store it in the PostgreSQL database on the target PC.
Thank you.
kcks66
Participant
Hi Mikhail,
In my case, where I want to convert historical data from DAT format and store it in a PostgreSQL database, which option should I use?
What should I do to make the Upload options, which are not displayed in my RapidGate, reappear?
kcks66
Participant
Hi Mikhail,
I finally managed to set up another PC with a Rapid SCADA configuration exactly the same as the target PC's. I managed to log in to the target PC via RapidGate. For the next step, please advise which part I should configure:
https://ibb.co/L00KSv0
This Historical Data Transfer option?
https://ibb.co/GtXdzWb
This Archive Replication option?
FYI, all channels on the Target PC and the RapidGate PC are configured with exactly the same Channel IDs. The Target PC's archives are configured with ModArcPostgreSql.
Thank you.
kcks66
Participant
Hi Mikhail,
> However, why don't you use *.dat format as a secondary to make the system more reliable?
Yes, I will consider this. Thanks for your suggestion.
Since my project has no commercial value, I would like to explore the other option:
> Deploy an additional instance of Rapid SCADA and use Rapid Gate module to transfer data from the additional instance to the main instance of Rapid SCADA.
I will get another PC, set up another Rapid SCADA instance, and load it with all the *.dat files. I will try it out and raise questions when the setup is ready.
Thank you.
kcks66
Participant
Hi Mikhail,
Thanks. I tried all of them and they worked as expected.
After discussing it with the system users and showing them the options, we finally decided to use PostgreSQL as the single main archive.
My next step is to migrate all the *.dat files (already in v6 format) into the PostgreSQL database. Please advise whether there is a faster or easier way to do this migration.
Thank you.
kcks66
Participant
Hi Mikhail,
I would like to explore all of this advice.
1) How should I configure the chart profile to open data from the *.dat archive?
2) Do I need to enable (set to true) the option below?
<!-- User can select archive -->
<SelectArchive>false</SelectArchive>
3) Do I need to activate both sets of archives, like below?
Thank you.
kcks66
Participant
Hi Mikhail,
In the previous Rapid SCADA v5, I used .dat as my main archive. Now that I have migrated to v6, I want to use ModArcPostgreSql as my main archive.
The .dat files from v5, which I converted to the v6 .dat format with the converter, can be trended by ModArcBasic only, right?
Please advise what I should do so that, after I set ModArcPostgreSql as my main archive, I can still trend the historical data previously converted from v5.
Thank you.
kcks66
Participant
Hi Mikhail,
I can see that the new converter continues to run after encountering data with more than 1440 records per day.
kcks66
Participant
Hi Mikhail,
Today the PC experienced an actual power supply trip. The PC automatically resumed around 30 minutes later. I checked Rapid SCADA and everything is working well.
I think the Automatic (Delayed Start) setting worked.
kcks66
Participant
Hi Mikhail,
I can see that the faulty .dat file contains two records per minute, whereas the other, successful .dat files contain one record per minute:
Any .dat file with more than 1440 records per day (24 hours × 60 minutes = 1440, i.e. one record per minute) will be treated as an error.