<!-- 
RSS generated by JIRA (8.3.4#803005-sha1:1f96e09b3c60279a408a2ae47be3c745f571388b) at Sat Feb 10 16:20:43 JST 2024

It is possible to restrict the fields that are returned in this document by specifying the 'field' parameter in your request.
For example, to request only the issue key and summary append 'field=key&field=summary' to the URL of your request.
-->
<rss version="0.92" >
<channel>
    <title>PFS-JIRA</title>
    <link>https://pfspipe.ipmu.jp/jira</link>
    <description>This file is an XML representation of an issue</description>
    <language>en-us</language>    <build-info>
        <version>8.3.4</version>
        <build-number>803005</build-number>
        <build-date>13-09-2019</build-date>
    </build-info>


<item>
            <title>[INSTRM-24] VIS CU Red 1 -- No turbo pump status anymore</title>
                <link>https://pfspipe.ipmu.jp/jira/browse/INSTRM-24</link>
                <project id="10300" key="INSTRM">Instrument control development</project>
                    <description>&lt;p&gt;We have had no status from the turbo pump since 2016/11/19 04:27.&lt;/p&gt;

&lt;p&gt;The result of the &lt;tt&gt;xcu_r1 turbo status&lt;/tt&gt; command is:&lt;/p&gt;

&lt;p&gt;12:51:59.483627 .hub 0 cmds d CmdIn=&quot;client.pfs&quot;,&quot;xcu_r1&quot;,&quot;turbo status&quot;&lt;br/&gt;
12:51:59.484879 .hub 0 cmds d CmdQueued=4766873,1479732719.48,&quot;client.pfs&quot;,54,&quot;xcu_r1&quot;,426,&quot;turbo status&quot;&lt;br/&gt;
12:51:59.495176 client.pfs 54 xcu_r1 d text=&quot;sending &apos;?V852\r&apos;&quot;&lt;br/&gt;
12:51:59.496299 client.pfs 54 xcu_r1 w text=&quot;failed to create connect or send to turbo: &lt;span class=&quot;error&quot;&gt;&amp;#91;Errno 111&amp;#93;&lt;/span&gt; Connection refused&quot;&lt;br/&gt;
12:51:59.497489 client.pfs 54 xcu_r1 f text=&quot;command failed: error(111, &apos;Connection refused&apos;) at /home/pfs/anaconda/lib/python2.7/socket.py:228&quot;&lt;br/&gt;
12:51:59.498529 .hub 0 cmds d CmdDone=4766873,&quot;f&quot;&lt;/p&gt;

&lt;p&gt;12:58:59.675700 .hub 0 cmds d CmdIn=&quot;client.pfs&quot;,&quot;xcu_r1&quot;,&quot;status&quot;&lt;br/&gt;
12:58:59.676937 .hub 0 cmds d CmdQueued=4767133,1479733139.67,&quot;client.pfs&quot;,55,&quot;xcu_r1&quot;,427,&quot;status&quot;&lt;br/&gt;
12:58:59.686857 client.pfs 55 xcu_r1 w text=&apos;pathetic version string: unknown&apos;&lt;br/&gt;
12:58:59.687879 client.pfs 55 xcu_r1 i version=&quot;unknown&quot;&lt;br/&gt;
12:58:59.689077 client.pfs 55 xcu_r1 i text=Present!&lt;br/&gt;
12:58:59.689966 client.pfs 55 xcu_r1 i text=&quot;monitors: &lt;/p&gt;
{String(&apos;turbo&apos;): Int(15), String(&apos;roughGauge1&apos;): Int(15), String(&apos;cooler&apos;): Int(15), String(&apos;temps&apos;): Int(15), String(&apos;gauge&apos;): Int(15), String(&apos;gatevalve&apos;): Int(60), String(&apos;ionpump&apos;): Int(15)}
&lt;p&gt;&quot;&lt;br/&gt;
12:58:59.690994 client.pfs 55 xcu_r1 i text=&quot;config id=0x7f42e0733cb0 &lt;span class=&quot;error&quot;&gt;&amp;#91;&amp;#39;tron&amp;#39;, &amp;#39;xcu_r1&amp;#39;, &amp;#39;pcm&amp;#39;, &amp;#39;cooler&amp;#39;, &amp;#39;temps&amp;#39;, &amp;#39;turbo&amp;#39;, &amp;#39;rough1&amp;#39;, &amp;#39;roughGauge1&amp;#39;, &amp;#39;ionpump&amp;#39;, &amp;#39;logging&amp;#39;&amp;#93;&lt;/span&gt;&quot;&lt;br/&gt;
12:58:59.692134 client.pfs 55 xcu_r1 : controllers=turbo,roughGauge1,cooler,pcmUdp,temps,gatevalve,PCM,ionpump&lt;br/&gt;
12:58:59.693180 .hub 0 cmds d CmdDone=4767133,&quot;:&quot;&lt;/p&gt;


</description>
                <environment></environment>
        <key id="11344">INSTRM-24</key>
            <summary>VIS CU Red 1 -- No turbo pump status anymore</summary>
                <type id="1" iconUrl="https://pfspipe.ipmu.jp/jira/secure/viewavatar?size=xsmall&amp;avatarId=10503&amp;avatarType=issuetype">Bug</type>
                                            <priority id="3" iconUrl="https://pfspipe.ipmu.jp/jira/images/icons/priorities/major.svg">Major</priority>
                        <status id="10002" iconUrl="https://pfspipe.ipmu.jp/jira/images/icons/statuses/generic.png" description="The issue is resolved, reviewed, and merged">Done</status>
                    <statusCategory id="3" key="done" colorName="green"/>
                                    <resolution id="10000">Done</resolution>
                                        <assignee username="cloomis">cloomis</assignee>
                                    <reporter username="fmadec">fmadec</reporter>
                        <labels>
                    </labels>
                <created>Mon, 21 Nov 2016 12:59:59 +0000</created>
                <updated>Thu, 18 May 2017 12:04:15 +0000</updated>
                            <resolved>Mon, 21 Nov 2016 13:17:21 +0000</resolved>
                                                                        <due></due>
                            <votes>0</votes>
                                    <watches>3</watches>
                                                                <comments>
                            <comment id="11600" author="cloomis" created="Mon, 21 Nov 2016 13:14:24 +0000"  >&lt;p&gt;There is a tcp-to-serial program (&lt;tt&gt;tcp_serial_redirect.py&lt;/tt&gt;, launched at boot) running on the BEE which got killed because it was logging too much and filled up the disk. The logfile (&lt;tt&gt;/home/pfs-data/nohup.log&lt;/tt&gt;) has been deleted and the program restarted.&lt;/p&gt;

&lt;p&gt;In any case the entire mechanism is an engineering convenience which is no longer needed (it allowed running the entire actor on a machine other than the BEE). We can remove the program during the upgrade at the end of the week.&lt;/p&gt;

</comment>
                            <comment id="11601" author="cloomis" created="Mon, 21 Nov 2016 13:17:21 +0000"  >&lt;p&gt;Hack fixed. Real problem to be removed.&lt;/p&gt;</comment>
                            <comment id="11602" author="arnaud.lefur" created="Mon, 21 Nov 2016 14:48:11 +0000"  >&lt;p&gt;&lt;blockquote&gt;&lt;p&gt;In any case the entire mechanism is an engineering convenience which is no longer needed (it allowed running the entire actor on a machine other than the BEE). We can remove the program during the upgrade at the end of the week.&lt;/p&gt;&lt;/blockquote&gt; I agree with that.&lt;br/&gt;
I powered the BEE off and on 10 minutes ago and restarted the actor.&lt;br/&gt;
Basically, I don&apos;t know whether the problem had already been solved before, but if you say so, I trust you.&lt;/p&gt;

</comment>
                            <comment id="12166" author="arnaud.lefur" created="Thu, 18 May 2017 10:22:13 +0000"  >&lt;blockquote&gt;&lt;p&gt;In any case the entire mechanism is an engineering convenience which is no longer needed (it allowed running the entire actor on a machine other than the BEE). We can remove the program during the upgrade at the end of the week.&lt;/p&gt;&lt;/blockquote&gt;

&lt;p&gt;Was that actually done?&lt;br/&gt;
When I started &lt;em&gt;bee-r2&lt;/em&gt;, I wasn&apos;t able to communicate with the turbo, so I looked on &lt;em&gt;bee-r1&lt;/em&gt; and saw that &lt;tt&gt;tcp_serial_redirect.py&lt;/tt&gt; was still running.&lt;/p&gt;

&lt;p&gt;I ran &lt;b&gt;python /home/pfs/tcp_serial_redirect.py -p /dev/ttyS2 -b 9600 -P 4001&lt;/b&gt; and it worked, but was it supposed to work without doing that?&lt;/p&gt;</comment>
                            <comment id="12168" author="cloomis" created="Thu, 18 May 2017 11:17:52 +0000"  >&lt;p&gt;There is one turbo per cryostat, so we still need a redirector running on each bee. Or did I misunderstand you? &lt;/p&gt;</comment>
                            <comment id="12169" author="arnaud.lefur" created="Thu, 18 May 2017 12:02:53 +0000"  >&lt;p&gt;Yes, maybe I wasn&apos;t clear.&lt;/p&gt;

&lt;p&gt;If I understand correctly, what you do with this script is basically what a Moxa does: you can communicate with the turbo through a TCP connection to the BEE&apos;s IP on port 4001.&lt;br/&gt;
It lets you communicate with the turbo from within the PFS network, and as a result you can run the xcuActor from anywhere (which is what we also want to do for the enuActor).&lt;/p&gt;

&lt;p&gt;But I thought it was only an engineering convenience, as you said: &lt;blockquote&gt;&lt;p&gt;In any case the entire mechanism is an engineering convenience which is no longer needed&lt;/p&gt;&lt;/blockquote&gt;&lt;br/&gt;
I assumed that you would remove this serial-to-Ethernet conversion and switch back to a direct serial connection for controlling the turbo from the BEE.&lt;/p&gt;

&lt;p&gt;Using &lt;b&gt;tcp_serial_redirect.py&lt;/b&gt; works well and is totally fine. I just needed clarification on your comment, to understand whether what we currently use is meant to be kept or not.&lt;/p&gt;

</comment>
                            <comment id="12170" author="cloomis" created="Thu, 18 May 2017 12:04:15 +0000"  >&lt;p&gt;Yes, keep it for now.&lt;/p&gt;</comment>
                    </comments>
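<!--
The thread above describes tcp_serial_redirect.py bridging the BEE's serial
line to TCP port 4001; the log's "[Errno 111] Connection refused" is what the
actor sees when that bridge is down. A minimal, stdlib-only sketch of the
client side of that path follows. The helper name, timeout, and buffer size
are illustrative assumptions; only port 4001 and the ?V852 command come from
this issue.

```python
# Sketch of the client side of the TCP-to-serial path discussed above.
# Assumptions: the helper name, 2.0 s timeout, and 256-byte read are
# illustrative; b"?V852\r" is the turbo query seen in the log.
import socket

def query_turbo(host, port=4001, command=b"?V852\r", timeout=2.0):
    """Send one command through the TCP-to-serial redirector; return the raw reply.

    If the redirector (tcp_serial_redirect.py) is not running on the BEE,
    this raises ConnectionRefusedError, i.e. the [Errno 111] seen in the log.
    """
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(command)
        return sock.recv(256)
```

With the redirector stopped, the same call should raise ConnectionRefusedError,
matching the failure in the original log.
-->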
                    <attachments>
                    </attachments>
                <subtasks>
                    </subtasks>
                <customfields>
                                                <customfield id="customfield_10500" key="com.atlassian.jira.plugins.jira-development-integration-plugin:devsummary">
                        <customfieldname>Development</customfieldname>
                        <customfieldvalues>
                            
                        </customfieldvalues>
                    </customfield>
                                                                                                                                                                                                            <customfield id="customfield_10010" key="com.pyxis.greenhopper.jira:gh-lexo-rank">
                        <customfieldname>Rank</customfieldname>
                        <customfieldvalues>
                            <customfieldvalue>0|ii011j:</customfieldvalue>

                        </customfieldvalues>
                    </customfield>
                                                                                                                                                                                                                                        </customfields>
    </item>
</channel>
</rss>